Virtual computing offers real benefits, real challenges

Virtualisation breaks the link between software and hardware; great for managers but not so good for others
Written by Rupert Goodwins, Contributor
The magic word for today is virtualisation. IBM has launched its Virtualization Engine, and you can't find a tech company on the planet that doesn't pay at least lip service to the idea. Briefly put, with virtualisation you can make one chunk of hardware look like multiple independent machines and thus run two copies of Windows, three of Mac OS and a brace of Linuxes all at the same time. That's not what gets IBM excited about it, though: it's keen because you can run lots of servers on one box where previously each box was limited to one operating system and one lot of software.
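For the curious, here's roughly what that looks like in practice: a minimal Python sketch that boots several independent guests on one host. It assumes QEMU is installed and that the disk images already exist; the image names and memory sizes are invented purely for illustration.

```python
# Minimal sketch: one physical host running several independent guests.
# Assumes QEMU is installed and the disk images already exist -- the
# image names and memory sizes are hypothetical.
import subprocess

guests = [
    ("windows.img", 512),   # a Windows guest
    ("linux-a.img", 256),   # first Linux guest
    ("linux-b.img", 256),   # second Linux guest
]

processes = []
for image, mem_mb in guests:
    # Each invocation presents its guest with its own 'virtual machine'.
    cmd = ["qemu-system-x86_64", "-m", str(mem_mb), "-hda", image]
    processes.append(subprocess.Popen(cmd))

for p in processes:
    p.wait()  # in practice the guests run until shut down from inside
```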

At first, this seems too good to be true: if you can run ten servers on the same chunk of hardware that previously managed just the one, will you end up buying 90 percent fewer boxes? Of course not: you never get something for nothing, especially not from a company whose sole purpose is to sell you stuff. If you've got one server running on a computer where every resource is maxed out -- processor gasping for breath, disk heads twitching like a Los Angeles seismograph, memory chocka and networks screaming -- then virtualisation will only make things worse.

However, this is rarely the case. Long before all your resources are full, some component will begin to limit throughput and trigger the law of diminishing returns. A common rule of thumb is that a PC-based server starts to run out of steam once about 30 percent of its resources are engaged: virtualisation can help here by letting different tasks use the underutilised resources that would otherwise be hanging around waiting for the overstressed stuff to finish.
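To see why that rule of thumb matters, here's a back-of-the-envelope calculation in Python. The utilisation and headroom figures are invented, purely to illustrate the arithmetic of consolidation.

```python
# Back-of-the-envelope consolidation maths for the 30 percent rule of thumb.
# Both figures below are invented for illustration.
per_server_load = 0.30   # each PC server tops out around 30% of its resources
headroom = 0.80          # leave 20% spare on the consolidated host

vms_per_host = int(headroom / per_server_load)
print(f"Servers per physical box: {vms_per_host}")      # -> 2

servers = 10
hosts_needed = -(-servers // vms_per_host)               # ceiling division
print(f"Hosts for {servers} servers: {hosts_needed}")    # -> 5, not 10
```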

Eventually, though, there'll be no spare capacity left to soak up, and here's where another benefit of virtualisation kicks in. Say you're running four virtual machines on a single piece of silicon -- four web servers, for the sake of argument -- and demand is such that performance is unacceptable. By moving just one of those servers off to fresh hardware, performance for all will be hugely improved, without having to touch the other three. By increasing the degree of control that managers have over where and how servers run, virtualisation should mean that you end up buying just the kit you actually need, not the amount you need to compensate for the basic inefficiencies of the PC architecture.
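As an illustration of that rebalancing decision, here's a toy Python sketch. The load figures and the migrate() stub are hypothetical; real products would do this with proper migration tooling.

```python
# Illustrative rebalancing rule for the four-web-server example above:
# when the host is saturated, migrate the single hungriest guest elsewhere.
# The load numbers and the migrate() stub are hypothetical.

def migrate(vm, destination):
    print(f"moving {vm} to {destination}")  # stand-in for a real migration

host_load = {"web1": 0.35, "web2": 0.30,
             "web3": 0.20, "web4": 0.25}   # sums to 1.10: overcommitted

if sum(host_load.values()) > 1.0:                # demand exceeds the hardware
    busiest = max(host_load, key=host_load.get)  # pick just one guest to move
    migrate(busiest, "fresh-hardware")
    del host_load[busiest]
    # the other three guests are untouched, yet all now have headroom
```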

There are plenty of other advantages. A virtual environment is just a mix of software and data, so not only can it be moved from machine to machine, it can be backed up on disk, duplicated, transmitted across the Net and so on -- making it much easier to handle in general. Even for individual users, the ability to run multiple operating systems or move complete set-ups from one computer to another will solve a lot of the headaches we have in our daily digital lives.
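Because a guest is just files, ordinary file tools will do. Here's a small Python sketch, with hypothetical paths, of backing up a virtual machine image and checksumming the copy:

```python
# A guest is just files, so ordinary file tools can back it up, duplicate
# it or ship it across the Net. The paths here are hypothetical.
import hashlib
import shutil

def backup_vm(image_path: str, backup_path: str) -> str:
    shutil.copy2(image_path, backup_path)      # duplicate the whole machine
    digest = hashlib.sha256()
    with open(backup_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()                  # lets you verify the copy later

print(backup_vm("linux-a.img", "/backups/linux-a.img"))
```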

Key issues remain -- usability, reliability and manageability -- but these are areas in which IBM has much experience. Assuming these have been addressed, then virtualisation is going to become an essential part of corporate IT -- and remarkably quickly at that. Both IBM and Intel are building hardware support for the idea into their processors, IBM into the Power range and Intel with its as-yet foggily described Vanderpool technology. Nobody wants to limit the market for their ideas, so it's a safe bet that virtualisation will be available on all our desktops sometime soon.
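For what it's worth, on Linux it's easy to check whether a processor carries these extensions: Intel's Vanderpool technology eventually surfaced as the 'vmx' CPU flag, and AMD's equivalent as 'svm'. A quick, Linux-only Python sketch:

```python
# Linux-only check for the hardware virtualisation support mentioned above:
# 'vmx' is Intel's extension (Vanderpool), 'svm' is AMD's equivalent.
with open("/proc/cpuinfo") as f:
    flags = {w for line in f if line.startswith("flags") for w in line.split()}

if "vmx" in flags or "svm" in flags:
    print("hardware virtualisation support present")
else:
    print("no hardware virtualisation extensions found")
```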

It's not all good, at least not for everybody. The virtualisation abstraction breaks the link between 'a server' -- in fact, the operating system -- and the hardware on which it runs. This is counter to the way that some companies, notably Microsoft, see computers. Load a new Microsoft operating system on a machine and the first thing it does is lock itself down harder than a limpet on a rock. It scans the computer and uses all the details of the hardware it finds to generate a security code to make darn sure it can't be moved onto another machine. But when the hardware it scans is virtual, what good is that? If your licence states that you can only ever run your OS on the computer on which it was first installed, do you give up on virtualisation or do you find an OS with less draconian conditions?
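To make the problem concrete, here's a toy Python version of hardware-locked activation -- emphatically not Microsoft's real algorithm, just the principle. All the hardware identifiers are invented stand-ins; the point is that a guest sees synthetic hardware, and that synthetic hardware travels with it from box to box.

```python
# Toy version of hardware-locked activation -- NOT Microsoft's real scheme,
# just the principle. All hardware identifiers below are invented.
import hashlib

def installation_id(hardware: dict) -> str:
    # Fold the scanned hardware details into a single security code.
    fingerprint = "|".join(f"{k}={v}" for k, v in sorted(hardware.items()))
    return hashlib.sha256(fingerprint.encode()).hexdigest()[:16]

# A physical box: the resulting code is pinned to this exact machine.
physical = {"cpu": "GenuineIntel-F41",
            "nic_mac": "00:0d:3a:11:22:33",
            "disk_serial": "WD-WMA123456"}

# A guest sees synthetic hardware instead -- and that synthetic hardware
# moves with the virtual machine, so the same code appears on any host.
virtual = {"cpu": "QEMU Virtual CPU",
           "nic_mac": "52:54:00:aa:bb:cc",
           "disk_serial": "QEMU-0001"}

print(installation_id(physical))  # locked to one box
print(installation_id(virtual))   # 'locked' to hardware that isn't there
```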

Likewise, Microsoft's digital rights management machinery, the Next Generation Secure Computing Base (NGSCB), has at its heart a secretive 'virtual vault' that spends at least some of its time making sure it's not running on a virtual machine. But how can DRM work when the entire environment in which it finds itself can be duplicated and spurted across the Net as easily as a Beyoncé MPEG?
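For illustration, here's one common -- and defeatable -- heuristic such a component might use to ask whether it's inside a virtual machine. This Linux-only Python sketch relies on the 'hypervisor' CPU flag that modern kernels expose; a hypervisor that wants to hide can simply decline to report it, which is exactly the problem.

```python
# One common (and defeatable) heuristic for 'am I inside a virtual machine?'.
# Linux-only sketch: modern kernels expose a 'hypervisor' CPU flag when the
# processor reports one. A hypervisor that wants to hide can simply not set it.
def probably_virtual() -> bool:
    try:
        with open("/proc/cpuinfo") as f:
            return any("hypervisor" in line for line in f
                       if line.startswith("flags"))
    except OSError:
        return False  # no /proc to read -- can't tell either way

print("virtual machine suspected" if probably_virtual() else "looks physical")
```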

Microsoft has its own virtualisation products, bought in when it purchased Connectix last year, and it's hawking them as a way of controlling software deployment across organisations and of running multiple Windows servers on a single Windows box. Since it controls those virtualisation products, it can ensure they don't break its hardware-reliant licensing models or its DRM thrust -- and thus avoid having to tackle those problems at their root. Other companies won't be so happy to play ball, even if Microsoft were in a mood to tell them how to do so.

Virtualisation offers unparalleled freedom in the way we configure and use our hardware. It's a big enough idea that those who don't want to play that way will have problems keeping up, while those who subscribe wholeheartedly will get the biggest benefits.
