This post is part of a series on myths I've heard mentioned during discussions with Kusnetzky Group clients. This time, the topic is expertise requirements. Suppliers of virtualization technology in general, and virtual machine software in particular, often leave the impression that their technology can simply be dropped into the environment and used without requiring any special expertise on the part of the organization.
As with any technology, virtualization can often be used at a rudimentary level with little or no knowledge. Any production use of the technology, on the other hand, requires knowledge and planning (I focused on planning in an earlier post). Let's examine the needs of one type of virtualization technology.
If an organization is planning to deploy virtual machines, it needs expertise in all of the following areas.
- Installing and configuring the hypervisor they've selected (VMware, Xen or KVM) to make optimal use of the available resources.
- Creating virtual machines so that they have access to the appropriate physical resources, including enough memory and storage for the task they're supposed to accomplish.
- Provisioning those virtual machines with the appropriate operating system, data management software, application frameworks and applications.
- If multiple operating systems are being deployed, expertise is needed for all of them.
- If multiple data management engines are being deployed, expertise is needed for all of them.
- If multiple application frameworks are being deployed, expertise is needed for all of them.
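To give a sense of what the "creating and provisioning" steps above involve in practice, here is a minimal sketch using KVM's `virt-install` tool (one of the hypervisors named above). The VM name, resource sizes and ISO path are illustrative assumptions, not recommendations; the point is that every flag represents a sizing or configuration decision someone on staff must understand.

```shell
# Illustrative only: create a KVM guest with explicitly chosen
# memory, CPU, and storage allocations. Each value below is a
# placeholder an administrator must size for the actual workload.
virt-install \
  --name app-server-01 \           # hypothetical VM name
  --memory 4096 \                  # RAM in MiB -- sized to the workload
  --vcpus 2 \                      # virtual CPUs
  --disk size=40 \                 # disk in GiB, allocated from default storage pool
  --cdrom /var/lib/libvirt/images/os-installer.iso \  # assumed installer ISO path
  --network default                # attach to the default libvirt network
```

Even this single command assumes the hypervisor host, storage pool and network were already configured correctly, which is exactly the expertise the list above describes.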
After considering the above, most of my clients take additional time to list all of the types and levels of expertise that will really be required. They make sure they have access to that expertise before embarking on the virtualization journey for the first time.
When your organization deployed virtual machine technology, were there any surprises along the way? Did the solution require outside consultants?