
100% CPU utilization

Fellow ZDNet blogger Paul Murphy recently warned against using virtualization technologies to really max out CPU utilization. The temptation, of course, is to assume that a server running at only 30% of capacity is being wasted. The reality is that pushing utilization much higher simply makes things too painful for users.

A teacher in one of our thin client labs called me today because performance was so dismal. As it turns out, she was having the kids create large, media-rich PowerPoint presentations. A check on system processes showed that the server was at 100% CPU utilization, with Office 2007 and web browsers eating up most of the clock cycles. I was able to partially address the problem by moving kids onto different terminal servers, but this is one of those cases where the teacher really should have been using our standalone fat client lab.
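For anyone who wants a scriptable way to run the same kind of check, here's a rough sketch using Python's psutil library. This isn't what I did in the lab (any process monitor will show you the same thing); it's just an illustrative way to see total CPU load and which processes are responsible for it.

```python
"""Sketch: list the processes eating a terminal server's CPU (psutil)."""
import psutil

# First pass primes the per-process CPU counters (the first call to
# cpu_percent() always returns 0.0); the blocking total-CPU call below
# then gives those counters a one-second measurement window.
for p in psutil.process_iter():
    try:
        p.cpu_percent(interval=None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

total = psutil.cpu_percent(interval=1)

# Second pass reads the per-process usage accumulated over that second.
usage = []
for p in psutil.process_iter(['name']):
    try:
        usage.append((p.cpu_percent(interval=None), p.info['name']))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

print(f"Total CPU: {total:.0f}%")
for cpu, name in sorted(usage, key=lambda t: t[0], reverse=True)[:10]:
    print(f"{cpu:6.1f}%  {name or '?'}")
```

On a loaded terminal server, the top of that list would have looked a lot like what I saw: a pile of Office and browser processes from a room full of simultaneous users.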

There are a few points to take away here.

  • First of all, virtualization and/or terminal services (whether Windows, Linux, or otherwise) are great technologies that can provide desktop-like experiences to a lot of users with vastly reduced administration.
  • Secondly, virtualization is not carte blanche to see just how far you can push server utilization. Just because there are spare clock cycles doesn't mean you should use them.
  • 100% server utilization is not productive in a classroom setting.
  • If you want to virtualize or use terminal servers, buy some serious hardware; you should reap the lower TCO through easier management, not through lower initial cost.
  • For now, serious multimedia is for standalone PCs; we'll see how things shape up as virtualization on the desktop continues to mature over the next 1-2 years.