
Five reasons desktop virtualization has gotten off to a slow start

Desktop virtualization arguably offers better reliability, enhanced security and easier management. Why haven't organizations flocked to this new technology?
Written by Dan Kusnetzky, Contributor

John Glendenning, Virtual Computer's Senior Vice President of Worldwide Sales and Business Development, and I had a very interesting conversation about Desktop Virtualization in general and where his company's product, NxTop, is enjoying a great deal of success.

During that discussion we covered quite a bit of territory. We ran out of time long before we ran out of interesting topics to discuss.  Here, in a nutshell, are a few of the reasons that desktop virtualization hasn't simply become the way client systems are deployed everywhere:

  1. Good enough is good enough (Golden Rule of IT #4): Organizations stay with the tools they're already using, even when those tools are no longer the best way to provide staff members with computing resources.
  2. Inertia: This is a variation of #1. IT administrators have learned (at times through painful experience) how to make Windows, Mac OS X and the other desktop environments they use work. While other approaches might be better, administrators believe it would take time to master the new tools and make them support the organization's workloads. Since they fear losing their jobs to someone elsewhere in the world doing the work remotely, they stay with what they know.
  3. Loss of control: Staff members have had control of their own desktop environments since Windows first launched back in November 1985 (26 years ago), and they fear losing that control. The heavy-handed, one-size-fits-all way some desktop virtualization solutions have been rolled out has become the stuff of legend. Soon general staff and IT are at war, and when that happens, the IT staff usually loses in the end.
  4. New technology isn't "invisible": How staff worked with their desktop systems changed significantly when their environments were virtualized. Sometimes that meant typical functions, such as cut and paste across applications, stopped working. Other times, basic functions that staff had come to rely on became unavailable or behaved in some other way.
  5. There are front-end costs: Organizations have already factored in all of the costs of supporting their desktops. Adding something new, no matter how much it improves the environment, still adds costs on the front end. The people who make the decisions are measured on front-end costs, not on the overall costs to the organization. That's why some of them ignore the significant back-end savings that virtual desktops offer.

I could go on and on, but I think you get the gist of the conversation.

I've been watching Virtual Computer for quite some time, since I stumbled on their "coming out of stealth" party at the Mandalay Bay hotel in Las Vegas in 2007 and got an unexpected tour of their product demos. I think their strong focus on using virtualization to create secure, reliable, manageable desktop environments, rather than merely pursuing virtualization for its own sake, makes a great deal of sense. Furthermore, they've developed ways for locked-down enterprise applications to coexist with open, personal environments, which would resolve many of the issues I've just mentioned.

As with other players in the desktop virtualization space, their challenge is getting the message about how their solution works out to everyone; until that happens, the fear, uncertainty and doubt continue.
