According to ExtremeTech.com, last year, at Intel's Professional Developer Forum, Intel's Paul Otellini "played back an episode of 'The Simpsons' on a prototype Vanderpool system while Louis Burns, general manager of Intel's desktop products group, played a 3D game. After shutting down the game, Burns rebooted the partition while the video streamed on uninterrupted."
For desktop users, virtualization could be a boon in situations where applications that routinely misbehave -- bringing down the entire system, including other applications and important data -- can be "quarantined" in a separate partition where they can't do the rest of the system any harm. But demand for this hardware-based approach to such "security" isn't much to speak of just yet. Until Vanderpool is released, the most common way to virtualize desktop systems is with products like VMWare's VMWare Workstation. But, judging by the way VMWare targets that product, the only people really interested in such desktop virtualization are software developers (especially Java developers who like to test their wares in multiple environments).
Where virtualization has proven more attractive is on the server side -- particularly in situations where server consolidation is a key goal. Not far behind Vanderpool (which, again, is for desktops) is Intel's virtualization technology for servers, known as Silvervale. What all this means for companies that have staked their futures on software-based virtualization, like VMWare (now a division of EMC), remains to be seen. Perhaps people will end up using solutions from VMWare to further divide each hardware-based partition into multiple software-based partitions, thereby turning individual systems into grid super-nodes or, better yet, self-contained grids. Have your own theory? Feel free to chime in on the comments below.