Virtualization: hardware and software working together in harmony

Summary: IT decision-makers must consider all of the reasonable alternatives available to them. Mainframes and midrange systems can't be dismissed as mere relics of the past.

We've been watching the industry as a whole shift its focus from the attributes of physical systems (performance, reliability, availability, cost, power consumption) to how applications, application components and complete workloads can be moved into a more virtual environment. This is a game changer on many levels. In particular, it improves IT efficiency, which reduces data center energy usage, a benefit for both the organization's pocketbook and the environment.

What is often held in the shadows and seldom discussed is that a virtualized environment is a carefully constructed and managed illusion. It requires a well-architected balance among system processors, memory, internal communication buses, storage devices and networking equipment, as well as a complex layer of software technologies.

Virtualization is a relative newcomer to the industry-standard x86 world. While it is growing by leaps and bounds, it is important to acknowledge that nearly all of the concepts we're seeing emerge in the x86 world were developed, tested and, some would say, perfected elsewhere: on IBM's System z (commonly known as "the mainframe"). These systems and their supported software continue to drive industry innovation and are often the most efficient and cost-effective way to address large-scale workloads.

What Is Virtualization?

Virtualization is a way to abstract applications and their underlying components away from the hardware supporting them and present a logical or virtual view of these resources. This logical view may be strikingly different from the physical view.
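To make the idea concrete, here is a minimal sketch, in Python, of a logical view diverging from the physical one. The class and method names are illustrative inventions, not any real hypervisor's API: a host carves out a slice of its resources, and the resulting "virtual machine" sees only that slice as its whole world.

```python
# Illustrative sketch only: hypothetical names, not a real hypervisor API.

class PhysicalHost:
    """The physical view: the actual hardware totals."""
    def __init__(self, cpus, memory_gb):
        self.cpus = cpus
        self.memory_gb = memory_gb
        self.allocated_cpus = 0
        self.allocated_memory_gb = 0

    def carve_out(self, cpus, memory_gb):
        # Refuse to promise more than the physical machine actually has.
        if (self.allocated_cpus + cpus > self.cpus or
                self.allocated_memory_gb + memory_gb > self.memory_gb):
            raise RuntimeError("insufficient physical resources")
        self.allocated_cpus += cpus
        self.allocated_memory_gb += memory_gb
        return VirtualMachine(cpus, memory_gb)

class VirtualMachine:
    """The logical view: the VM believes these are its total resources."""
    def __init__(self, cpus, memory_gb):
        self.cpus = cpus
        self.memory_gb = memory_gb

host = PhysicalHost(cpus=32, memory_gb=256)
vm = host.carve_out(cpus=4, memory_gb=16)
print(vm.cpus, vm.memory_gb)      # logical view: 4 CPUs, 16 GB
print(host.cpus, host.memory_gb)  # physical view: 32 CPUs, 256 GB
```

The same machine can answer "how many CPUs do I have?" two different ways depending on who is asking, which is exactly the striking difference between logical and physical views described above.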

Virtualization can create the artificial view that many computers are a single computing resource, or that what appears to be a single system is really many individual computers working together. It can make a single large storage resource appear to be many smaller ones, or make many smaller storage devices appear to be a single device.
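The storage side of that trick can be sketched in a few lines. This is a hedged illustration, not how any particular volume manager works: several small physical devices are presented as one larger logical volume, and the virtualization layer's real job is translating a logical block address into (device, physical block).

```python
# Hypothetical sketch of storage aggregation; real volume managers are far
# more sophisticated, but address translation is the core trick.

class LogicalVolume:
    def __init__(self, device_sizes):
        # device_sizes: capacity of each physical device, in blocks.
        self.device_sizes = device_sizes
        self.total_blocks = sum(device_sizes)

    def translate(self, logical_block):
        """Map a logical block number to (device index, physical block)."""
        if not 0 <= logical_block < self.total_blocks:
            raise IndexError("logical block out of range")
        for device, size in enumerate(self.device_sizes):
            if logical_block < size:
                return device, logical_block
            logical_block -= size

volume = LogicalVolume([100, 100, 50])  # three small disks...
print(volume.total_blocks)              # ...seen as one 250-block device
print(volume.translate(150))            # lands on the second disk: (1, 50)
```

Applications see one 250-block device; the mapping in the other direction (one big device split into many small logical ones) is the same idea run in reverse.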

This virtual view is constructed using excess processing power, memory, storage, or network bandwidth. For this magic to work smoothly, efficiently and reliably, system architects and developers must find the correct balance of hardware and software functions. If the magic incantation is done just so, the results are high levels of manageability, reliability, availability, and performance. Virtualization also minimizes requirements for floor space and power and reduces the production of heat. This can mean smaller, faster, more power-efficient, more "green" computing for everyone.

The need for the correct balance of hardware can be seen at every layer of virtualization technology in use today. Let's examine those layers and how the right mix of hardware and software can transform virtualization from a computer science project into a technology that can be safely and simply used in a production environment.


Daniel Kusnetzky, a reformed software engineer and product manager, founded Kusnetzky Group LLC in 2006. He is responsible for research, publications, and operations. Mr. Kusnetzky has been involved with information technology since the late 1970s. Mr. Kusnetzky has been responsible for research operations at the 451 Group; corporate and...
