Since attending the Parallels Summit 2009 back in January I’ve been listening out for virtualisation news more intently than ever. Although many companies are jumping on the bandwagon, it is the processor builders at AMD and Intel that seem to have cornered the market on snappy news angles as they talk about parallelism and concurrency in the context of server consolidation.
For its part, Parallels appears to be selling heavily on its ‘Operations Automation’ offering, which is an operations support system (OSS) for service providers who need to automate the delivery of server-based apps and other resources.
But now I’m wondering if virtualisation is getting even more sophisticated as it spreads to every available corner of the enterprise. What may have started at the server level has been rapidly spreading towards the desktop in the form of virtualised application delivery – and now may be headed for the high-end graphics-intensive workstation.
At least that is what the latest announcements appear to suggest. Late last night I saw news of Parallels Workstation Extreme, a ‘solution’ that the company says offers “near-native performance for resource-intensive applications”.
This then, if it works, is grown-up virtualisation, where support for large amounts of memory, use of multiple CPU cores and direct access to graphics cards is not a problem as users run multiple operating systems on the same physical box.
Unsurprisingly perhaps, Parallels worked with Intel, NVIDIA and HP on its new product, which CEO Serguei Beloussov describes as suitable for users of resource-intensive applications, such as those used for oil and gas exploration.
In his official press statement Beloussov detailed the specs: “Parallels Workstation Extreme offers users support for 3D professional graphics cards via Intel VT-d and the new NVIDIA SLI Multi-OS technology, delivering near-native performance. The solution also offers up to 16 CPU cores and 64GB of RAM for guest OSs.”
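For readers wondering whether their own hardware has the hooks this kind of direct-access virtualisation relies on, a rough check is possible from a Linux command line. This is a sketch under the assumption of a Linux host with a readable /proc/cpuinfo; it is not part of the Parallels product, just a way to spot the relevant CPU and chipset features.

```shell
# Look for hardware virtualisation extensions in the CPU flags:
# 'vmx' indicates Intel VT-x, 'svm' indicates AMD-V.
grep -m1 -oE 'vmx|svm' /proc/cpuinfo || echo "no hardware virtualisation flags found"

# Directed I/O support (Intel VT-d / AMD IOMMU), which device
# pass-through depends on, is typically reported in the kernel
# boot log; this may print nothing if the feature is absent or
# disabled in the BIOS.
dmesg 2>/dev/null | grep -iE 'DMAR|IOMMU' | head -n 3
```

Note that seeing the CPU flag is not enough on its own: VT-d also has to be supported by the chipset and enabled in the system firmware.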
I contacted Jon Collins at analyst house Freeform Dynamics for his opinion on this development, as just this morning I had been reading his Virtualisation – The State of Play presentation on the web.
Collins told me that, “Breaking the emulation deadlock and enabling more of a ‘pass-through’ approach is the way that virtualisation is going in general, so this news makes sense. However while high-end graphics is an important feature for virtualisation, this is not just for high-end applications. A lack of usable graphics capability imposes a constraint which can get in the way when putting together the business case – so this should be seen as much as the removal of a bottleneck, as providing a higher-order capability for specific application types.”
Where next for virtualisation then? Well, you might like to follow the argument that the virtual machine will grow in power and dominance and start to rule other computing domains. New compute structures may then emerge as the very fabric of the IT ecosystem morphs to shape itself around the virtual data centre. Will this be a bad thing? Many would say perhaps not.