Barry Phillips, Chief Marketing Officer and Vice President of Sales at Wanova, ran into problems trying to comment on "Why hasn't desktop virtualization taken over the world?" It appears that ZDNet's comment system wouldn't accept his comment for some reason. After telling him that "the dog must have eaten it," I promised to make his comment a central theme of today's post.
Unfortunately, there is confusion around Desktop Virtualization (DV). Most people think that VDI and DV are synonymous, but VDI is just one small use case of DV, one aimed at static images delivered over a high-speed connection to a thin client device. VDI is not a solution for the 600 million business PCs. For DV to be successful, it has to provide the centralized image management that IT needs while enabling the image to run locally on the PC.
Barry has a really good point, one that I'd like to expand upon in this article. A whole section of my book "Virtualization: A Manager's Guide" is devoted to the topic of desktop virtualization. If you don't mind (and even if you do), I'd like to include a snippet from the book to support what Barry is saying.
Desktop virtualization is the use of several virtualization technologies, either together or separately. Let’s look at each of these cases in turn.
- When “desktop virtualization” is used to describe making it possible for people to access a physical or virtual system remotely, access virtualization technology is used to capture the user interface portion of an application. The interface is then converted to a neutral format and projected across the network to a device that can display it and allow the user to enter and access information. This means that just about any type of network-enabled device can be used to access the application. Suppliers such as Citrix, Microsoft, and VMware offer client software for tablets, smartphones, laptops, and PCs, making it possible for users of those devices to access applications running elsewhere on the network.
- When “desktop virtualization” is used to describe encapsulating an application using client-side application virtualization technology and then projecting it, in whole or piecemeal, to a remote system for execution, the application can either remain on the client device or be deleted once the user completes the task, depending on the settings chosen by the IT administrator. This means, of course, that the client system has to run the operating system the application needs. Windows applications, for example, would need to run on Windows executing on a PC or laptop.
- When “desktop virtualization” is used to describe encapsulating the entire stack of software that runs on a client system, the phrase starts to take on a great deal of complexity. That encapsulated virtual client system becomes highly mobile. Here are the possibilities:
- Local execution. One or more virtual client systems could execute on a single physical client system. This allows personal applications to run side by side with locked-down corporate applications.
- Blade-based execution. Virtual client systems could run on blade servers, with the user interface projected to physical PCs, laptops, or thin client systems using access virtualization technology.
- Remote execution. Virtual client systems could run on a server that resides in the organization’s data center. The user interface is projected to physical PCs, laptops, or thin client systems using access virtualization technology.

Since the industry is using the same phrase to describe all of these different approaches, the concept of desktop virtualization can be quite confusing to those unfamiliar with all of the different types of technology that could be pressed into service.
Thanks, Barry, for helping me expand the discussion of the concept of desktop virtualization.
Wanova, by the way, provides a "Desktop Cloud" solution that centralizes PC images in the network, similar to VDI, but enables a copy of the image to run locally on a laptop or desktop so users can take advantage of the native performance of the PC. That includes the ability to run multimedia applications and to work while disconnected from the network.
The company would point out that this approach allows organizations to maintain a single copy of operating systems, applications, and the like, and even enables them to back up the entire end-point image, not just the data. IT administrators can then return a system to its exact prior state when "something bad has happened."