As I'm preparing to leave VMworld, I find myself thinking about a question I heard at the event. That question is "what is virtualization?". Living in the virtual world a great deal of the time, it's hard for me to remember when I first came across the term. It was somewhere in the early 1980s while I was doing my best to handle the responsibilities of VAXcluster programs manager at Digital Equipment Corporation.
The definition I've been working with ever since those early days is this: trading increased use of processing power, memory, and storage to move IT functions into an artificial environment. Why would one do that? The answer is fairly clear: to obtain the benefits of running those IT functions in an enhanced world that may be, and often is, strikingly different than the real physical world.
Virtualization can touch how people access applications and data, how applications run, how processing occurs, how networks function, how storage functions, and, increasingly, how workloads are optimized in a datacenter.
The goal usually is to make the best use of available computing "power", to protect critical data, and to make an environment more manageable, more reliable, and more cost effective.
What is the definition you are using?