Breaking through the virtualization confusion

Many vendors have embraced the idea of helping organizations work with a logical view of computing resources rather than a physical view. Unfortunately, each has chosen a different layer or set of layers as its focus, and yet they all still call their product "virtualization." Maybe that's because "virtualization" has become a powerful buzzword, much as "client/server computing" was and "Web 2.0" is today.

My clients tell me that they're confused and don't always know which way to go. This, of course, is a wonderful situation for a consultant. So, I'd like to extend my appreciation to everyone who has done their best to confuse the market in the hopes of keeping IT decision-makers from discovering competitors.

Since there are virtualization technologies that address some combination of access virtualization, application virtualization, virtual processing, storage virtualization, virtual networks, and management/provisioning of virtual resources, it's easy to understand why decision-makers are not sure where and how to apply this technology. Each of these approaches can offer demonstrable value to the organization, but I'm not sure any product could really live up to the hype in every situation.

Although some vendors promise the sun and the stars, the customer is much more likely to simply get mooned unless it plans ahead and selects technology to fit a well-defined architecture, rather than purchasing point products to relieve the pressure in each area.

Do you believe that installing any single product is going to automatically result in reductions in hardware, software and staff-related costs? I have my doubts that moving forward with any single technology, without fully understanding the ramifications, will automatically result in anything other than vendor enrichment. I'm reminded of something attributed to Laurence J. Peter: "If you don't know where you're going, you will probably end up somewhere else." Before embarking on a journey to a virtualized environment, it would be very wise to have solid answers to the following questions.

  • What does the organization really need?
  • Is it willing to abandon established technology and move to something new, or must it evolve carefully and slowly in the direction of a more virtualized environment?
  • Does it have all of the necessary levels of experience and expertise to make the best use of virtualization technologies?
  • Does it make sense to carefully architect an overall solution, or does it make more sense to take problems one at a time?

Depending upon the goal, virtualization technologies might best be used in conjunction with one another. Here are some examples.

  • Consolidation — access, processing, storage and network virtualization technology can all play a role when the goal is consolidation. It would also be wise to consider adding some technology to increase the reliability and availability of the consolidated computing solutions as well. It's rather embarrassing to consolidate everything down to a few servers and then have them go offline for some reason.
  • Performance — application, processing and storage virtualization are usually deployed when the organization’s goal is raw performance. Since this type of environment usually is quite complex, it would be wise to also select some form of management and provisioning software to keep costs of installation, administration and operations in line.
  • Scalability — application, processing and storage virtualization technology are typically selected when scalability is the goal. As with the goal of raw performance, it would be very wise to include management and provisioning tools in the overall architecture.
  • Agility — depending on the type of applications, platforms and operating systems used in the organization, a mix of all of the virtualization technologies is likely to be needed.
  • Availability/reliability — making sure that individuals accessing the organization’s computing solutions never see a failure usually requires the use of application, processing and storage virtualization technology. Suppliers of management and provisioning software will often say that their products, in conjunction with virtual machine technology, can do it all. I doubt this is true if mainframes, midrange machines and older single-vendor environments are part of the picture.
  • Unified management domain — this requirement ought to be at the top of the list because this technology can often produce significant cost reductions quickly but, unfortunately, few organizations include this type of virtualization technology in their initial planning.

Although point solutions can be helpful in the short term, it would be wise for an organization's IT department to develop an overall architecture before selecting products. A well-designed architecture leaves room for growth and for new technology to be implemented, and helps keep projects from failing.

    Does your organization work within a well-established architecture when selecting virtualization technologies? Who designed the architecture?