
What's holding Desktop Virtualization back?

Written by Dan Kusnetzky, Contributor

Server virtualization appears to be gaining acceptance in even the most conservative organizations that deploy industry standard systems. It has long been a fixture in mainframe- and midrange-system-based datacenters. Desktop virtualization, on the other hand, seems to be considered a technology looking for a problem to solve. Why is that? Depending upon which layer of the Kusnetzky Group model of virtualization software is being considered, there are quite a number of benefits that could quickly be realized.

Inhibitors to the adoption of desktop virtualization technologies

There are a number of factors that appear to be inhibiting the adoption of desktop virtualization technology. The following bullets briefly review some of them.

  • Change - IT organizations are risk- and change-averse because it is their job to keep things running in datacenters that look a bit like computer museums. Any change can result in a cascade of technical problems. Furthermore, adding this technology could also mean deploying new desktop hardware or PC blade solutions back in the datacenter.
  • People issues - Convincing all of the stakeholders that adding one more technology to an already precarious stack is a worthwhile effort is challenging. Adding something new also brings conflicts over who will own and manage the new technology.
  • Cost - Adding a new technology adds costs over what is currently being done, and virtualization technology is no different. IT organizations are often measured according to the "what have you done for me lately" school of financial management, so long-term benefits may be ignored due to a strong focus on short-term cost cutting.
  • Complexity - Organizational IT infrastructures are already a complex mix of technologies that require several kinds of expertise. Adding something new brings with it the requirement to develop expertise in that new technology as well.
  • Licensing issues - IT decision makers are doing their best to comply with licensing requirements, which is a full-time job all by itself. Each product and each layer of technology brings with it a set of terms and conditions that may not be totally aligned with those found at other layers. For example, just because it is possible to encapsulate Windows and run multiple copies on a single machine, or to run it back in the datacenter and project the user interface out to a remote device, doesn't mean that the organization automatically has the right to use Windows in that way. Once the issues with the operating system are found and resolved, it is then necessary to go up the stack and look into the rules for each layer of software. It may simply be too much work for some organizations at this point in time.

What should the industry do?

It is clear that we're in the early stages of adoption and that many different types of technology are all being described using the same catch phrases and buzzwords. The industry as a whole needs to simplify its messaging.

Licensing is a pain. The industry needs to make it easier to understand and easier to comply with.

I'm tracking six different layers of virtualization technology. Each layer often has multiple types of products. Even though I'm doing this full time, I still can't keep up with everything that is happening everywhere. How could a busy IT decision maker keep up? Education is a constant need.

What do you think would accelerate the adoption of desktop virtualization technologies?
