
Virtual machine software is only the beginning

Virtualization is a very useful tool. Unfortunately, many mistakenly equate virtual machine technology with virtualization itself. In reality, virtualization is a well-established group of technologies with a long track record of success. Using them together is where the real benefits may be found.
Written by Dan Kusnetzky, Contributor

Virtualization - from mainframes to industry standard systems

Virtualization is a very useful tool that allows system resources to be utilized in new ways. Unfortunately, many people mistakenly equate virtualization itself with virtual machine technology, which encapsulates all of the software that runs on a physical system and allows that capsule to run alongside others on a single host system. In reality, virtualization is a well-established group of technologies with a long track record of success in data centers around the world. Suppliers such as HP, IBM, Intel, and Oracle have been involved with virtualization technology at every level for a very long time. IBM, for example, has been offering virtualization technology since the late 1960s.

The narrow view of virtualization as merely the virtual machine software used to support virtual desktops and virtual servers has begun to get in the way of pushing this set of technologies forward. I believe it is time to think about virtualization much more broadly. To be truly efficient, a workload is likely to need access, application, processing, network, and storage virtualization.

I’m happy to see that IBM, VMware, Citrix and a few others are working to expand the industry's view of this useful technology.

The What and Why of Virtualization

Virtualization is the use of hardware and software technology to present a logical view of resources. Usually, this means trading excess processing power, memory, network and storage capacity to create a more useful, albeit artificial, environment.

This logical view is often strikingly different from the actual physical view. What does this really mean? Users may see the image of many different computers even though only a single system is present. They may see many individual systems as a single computing resource. In some cases, this means that a workload invokes the processing power of literally thousands of physical computers. Individuals can be allowed to access computing solutions with devices that didn't exist when developers created the applications. Applications may appear to use devices that have long been considered obsolete even though none are actually present.
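
To make the idea of a logical view concrete, here is a deliberately simplified Python sketch, with hypothetical class names and numbers, that carves one physical host into several logical machines, each of which appears to its user to be an independent computer. Real hypervisors do far more, including scheduling, isolating and often oversubscribing these resources; this is only an illustration of the principle.

    from dataclasses import dataclass

    @dataclass
    class PhysicalHost:
        """The actual hardware: one machine with fixed resources."""
        cpus: int
        memory_gb: int

    @dataclass
    class LogicalMachine:
        """What a user or workload sees: an apparently independent computer."""
        name: str
        cpus: int
        memory_gb: int

    def partition(host, requests):
        """Present one physical system as if it were many logical ones.

        Each requested slice is checked against the remaining physical
        capacity, so the logical view never promises more than the
        hardware can actually deliver.
        """
        remaining_cpus, remaining_mem = host.cpus, host.memory_gb
        machines = []
        for name, (cpus, mem) in requests.items():
            if cpus > remaining_cpus or mem > remaining_mem:
                raise ValueError(f"not enough physical capacity left for {name}")
            remaining_cpus -= cpus
            remaining_mem -= mem
            machines.append(LogicalMachine(name, cpus, mem))
        return machines

    if __name__ == "__main__":
        host = PhysicalHost(cpus=32, memory_gb=256)
        # Three "computers" appear to exist even though only one system is present.
        for vm in partition(host, {"web": (8, 32), "db": (16, 128), "test": (4, 16)}):
            print(vm)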

As one might expect, adding layers of software between the application and the underlying physical system could affect the performance of entire systems or individual components. After all, the underlying systems are doing more work to support this logical or virtualized view for developers and users. AMD, IBM, Intel and other suppliers have been investing heavily in technology to ease that burden and provide levels of performance in virtualized systems that closely approximate the performance of a physical system.

Why Virtualization?

With overall server utilization in many workload environments under 20%, the case for adopting virtual machine software boils down to organizations wanting to make the most of their available technology assets, increase their efficiency and agility, and improve their ability to provide products and services to customers. In the end, organizations are seeking ways to use technology to both increase their revenues and lower IT costs. How does virtualization do this?

  • By deploying redundant systems, virtualization can present the image that solutions never slow down or fail. It can optimize the use of systems moment by moment throughout the day. Hardware and software costs can be reduced in a virtual environment without letting staff-related costs go through the roof.
  • Organizations can consolidate many independent applications on a single system to make full use of its processing power. They can also use virtualization to spread work over many systems to achieve levels of scalability or performance that once required dedicated systems or were simply unheard of just a few years ago.
  • One of the areas offering the largest immediate return on investment is using virtualization technology to manage complete environments as a single domain even though they’re really made up of a diverse collection of individual systems.
  • Systems, storage, networking and other resources can be used as a large shared pool to make maximum use of these resources and to reduce or eliminate overprovisioning.
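
To put the utilization figure mentioned above into rough numbers, the following back-of-the-envelope calculation (the figures are hypothetical and purely illustrative) shows why lightly used servers are attractive consolidation candidates: a dozen machines each running at about 15% utilization carry less than two machines' worth of real work.

    import math

    # Hypothetical, illustrative figures only.
    servers = 12                 # lightly used physical servers today
    avg_utilization = 0.15       # roughly 15% average utilization each
    target_utilization = 0.60    # run consolidated hosts hotter, but keep headroom

    total_demand = servers * avg_utilization              # about 1.8 servers' worth of work
    hosts_needed = math.ceil(total_demand / target_utilization)

    print(f"{servers} servers at {avg_utilization:.0%} utilization could run on "
          f"roughly {hosts_needed} consolidated hosts at {target_utilization:.0%}")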

Virtualization - mainstream tools

There are many layers of technology that virtualize some portion of a computing environment. Each of these tools can be applied to making industry standard systems part of a larger, more efficient, more productive computing environment. It is wise to consider using these technologies together to create a more efficient, flexible, agile environment.

Let’s quickly review how each type of virtualization helps organizations.

  • Access virtualization makes it possible for nearly any type of device to be used to access nearly any type of application over just about any type of network. Using this technology, developers aren't forced to change applications to allow individuals to get things done using a handheld device, a thin client, a laptop computer or even a desktop system. IBM was among the first to offer this form of technology for its mainframes decades ago. Microsoft and Citrix have been proponents of this type of technology on industry standard systems since the 1990s. VMware added this type of technology to its portfolio in the 2000s.
  • Application virtualization creates a protected environment that makes it possible to automatically restart an application after a failure, start another instance of an application if it is not meeting service level objectives, or balance the workload among multiple instances of an application (a minimal sketch of this restart-on-failure idea appears after this list). IBM has offered this type of technology for its entire portfolio of systems since the 1970s. Industry standard systems have seen the benefit of this technology from many suppliers, including Citrix, Microsoft, VMware, AppZero, triCerat and a number of others.
  • Processing virtualization hides the physical hardware configuration and makes it possible, on the one hand, to present a single system as if it were many or, on the other hand, to present many systems as if they were a single resource. IBM and Intel have been pushing the state of the art in this area for quite some time. Citrix, VMware and Microsoft joined this party in the late 1990s.
  • Storage virtualization presents a logical view of storage that allows many systems to share a single storage resource that’s located on the network. It may also be used to make many storage resources appear to be a single resource to simplify use and provide a high level of storage optimization. Companies such as EMC, NetApp, HP, Hitachi and IBM are the players in this area.
  • Network virtualization presents a logical view of network resources that is secure and managed. Companies such as Cisco, Dell, IBM, HP, Juniper and VMware have offerings in this area.
  • Management and security software makes it possible for administrators to treat many systems as a single computing resource. IBM, HP, CA, RSA and BMC offer technology in this area.
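
As a minimal sketch of the restart-on-failure behavior described in the application virtualization item above (the command, restart budget and back-off values are hypothetical, and no vendor's actual product works exactly this way), the following Python supervisor runs a workload and restarts it whenever it exits with an error, up to a fixed number of attempts.

    import subprocess
    import sys
    import time

    def supervise(command, max_restarts=3, backoff_seconds=2.0):
        """Run a command and restart it on failure, up to a restart budget.

        This mimics, in miniature, what an application virtualization layer
        does when an application instance dies or stops meeting its service
        level objectives.
        """
        restarts = 0
        while True:
            print(f"starting: {' '.join(command)}")
            result = subprocess.run(command)
            if result.returncode == 0:
                print("application exited cleanly; supervision complete")
                return
            restarts += 1
            if restarts > max_restarts:
                print("restart budget exhausted; escalating to an operator", file=sys.stderr)
                return
            print(f"exit code {result.returncode}; restarting in {backoff_seconds}s "
                  f"({restarts}/{max_restarts})")
            time.sleep(backoff_seconds)

    if __name__ == "__main__":
        # Example: supervise a short-lived placeholder workload.
        supervise([sys.executable, "-c", "print('doing some work')"])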

These layers of technology are not new to the industry. Many suppliers have been working to bring these layers of technology to organizations needing flexible, powerful computing environments.

Suggestions for selecting platforms and virtualization technology

It is important to have a clear picture of the organization's goals before selecting a specific type of virtualization technology or hardware platform. Depending upon the organization's requirements and goals, different technologies come to the forefront. Organizations often seek higher levels of performance, greater agility, increased scalability, consolidation of many workloads onto a smaller number of physical systems, or a unified management domain.

Regardless of the organization’s goal or goals for the use of virtualization technology, it is wise to select a platform that has the biggest “ecosystem”, that is, the platform supported by the largest number of suppliers. Organizations would be wise to consider offerings that support the broadest set of systems, data management software, development tool software, virtualization software, application software and management software rather than simply going along with single-vendor offerings.

This focus on a common hardware architecture that offers hardware assists for virtualization technology will, in the end, reduce the costs of hardware acquisition while still offering the organization the ability to track performance improvements over time.

What does the future hold?

Suppliers of both hardware and software are focusing a great deal of investment on virtualization at all levels of the model. This includes suppliers of systems, operating system software, data management software, application development tools and application development frameworks, all working to offer organizations a highly optimized set of virtualization solutions at the lowest possible cost. Through the efforts of AMD, Intel, IBM and others to optimize virtualization and decrease power consumption, IT managers have the ability to increase their overall system utilization while decreasing costs by 50% or more. Here are a few of the likely improvements virtualization technology will provide in the near future.

  • Optimal use of an organization’s systems will be assured because applications, application components and data will be moved to the most appropriate environment on a moment-by-moment basis.
  • Organizations will find it much easier to add processing power as needed to meet their own service level objectives.
  • New technology will co-exist and work efficiently with more established technology.
  • Applications will be accessible from nearly any type of network-enabled device, over just about any network, from nearly anywhere, without organizations being forced to re-implement or re-design their applications.
  • Application performance, scalability and reliability will increasingly be built into the environment rather than being a matter of tedious or complex design.
  • Applications and data will be increasingly secure and protected, thus removing the fear IT management has of security breaches, malicious email messages and the like.
  • Individual software developers will no longer have to care which system is working for them, where it is located or what type of software is supporting them. They’ll be able to focus on the task at hand rather than being asked to take on the role of system operators.
  • Access virtualization is likely to be increasingly used to simplify access to applications and data from traditional PCs as well as properly configured smartphones, tablets and other intelligent network-enabled devices.
  • Application virtualization will be increasingly considered as a way to deliver applications to compatible systems and deal with version incompatibilities.
  • Other aspects of processing virtualization, such as technology to support parallel processing, will be utilized beyond technical or high performance computing.
  • Storage virtualization technology will be increasingly deployed to increase storage performance while also lowering the data center space required for storage and reducing both the power consumption and heat production of storage devices.

With few exceptions, it is expected that IT solutions will live in a virtual world.
