
What is "Virtualization 2.0?"

I've been asked to moderate a panel at an upcoming event. The topic of the panel is "Virtualization 2.
Written by Dan Kusnetzky, Contributor

I've been asked to moderate a panel at an upcoming event. The topic of the panel is "Virtualization 2.0." This is a topic that my former colleagues at IDC have been flogging for quite a while. They even have a series of successful conferences on the topic.

What does this really mean? Is this something really new, or merely an incremental improvement in industry-standard datacenter technology, drawing on things the mainframe and midrange folks have been doing for decades? More the latter than the former, in my view.

What is "Virtualization 2.0?"

Virtualization 2.0 is a catch phrase developed by several industry research firms in the hope of distinguishing what they publish from what others publish. Done properly, it persuades IT decision-makers that these firms offer leading-edge thinking, so that buyers select their research services over those of competitors who may have a deeper well of knowledge and expertise but have chosen different terms to describe their research.

The research firms using this catch phrase define Virtualization 1.0 as using technology to place computing functions in an environment that hides and simplifies their view of the underlying physical resources. This "logical" or "virtual" environment can offer features beyond those found in the actual physical environment.

They would then go on to say that Virtualization 1.0 was a static environment. That is, once a resource is encapsulated and assigned to a specific set of physical resources, it doesn't move. Virtualization 2.0, they say, arrived when those computing resources could be moved dynamically during processing, yielding a more agile, better-optimized environment.
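To make the distinction concrete, here is a toy sketch (names and numbers are hypothetical, and this is not any vendor's actual implementation): under the "static" model a workload keeps the host it was originally assigned, while under the "dynamic" model a simple rebalancer can move a running workload from the busiest host to the least busy one.

```python
# Toy model of static vs. dynamic placement of virtualized workloads.
# Hosts, workloads, and the rebalancing rule are all illustrative;
# real live-migration features are far more involved.

class Host:
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity
        self.workloads = []

    def load(self):
        # Total demand of everything currently placed on this host.
        return sum(w.demand for w in self.workloads)

class Workload:
    def __init__(self, name, demand):
        self.name = name
        self.demand = demand

def place_static(workload, host):
    """'Virtualization 1.0': assign once; the workload never moves."""
    host.workloads.append(workload)

def rebalance(hosts):
    """'Virtualization 2.0': move a workload from the busiest host to
    the least busy one while the environment is running."""
    busiest = max(hosts, key=lambda h: h.load())
    idlest = min(hosts, key=lambda h: h.load())
    if busiest.workloads and busiest.load() - idlest.load() > 1:
        moved = busiest.workloads.pop()
        idlest.workloads.append(moved)
        return moved
    return None
```

For example, if two workloads of demand 4 are placed statically on host "a" while host "b" sits idle, one call to `rebalance` evens the loads at 4 apiece, which is exactly the kind of dynamic optimization the "2.0" label is meant to describe.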

What's new?

This type of environment first appeared at least 20 years ago in the mainframe world and at least 15 years ago in the midrange world. These research firms have declared a new catch phrase because the capability is now appearing on industry-standard systems.

While I love the fact that their marketing catch phrase seems to have caught on in some quarters, I question whether anything is really new, or whether this is just the progressive realization of the idea that physical resources are often best utilized when they are hidden from the computing functions that use them.

What's your take on this? Do you think that something new is really happening here? If so, please let us know what's new that didn't emerge years ago in other parts of the IT industry.
