Despite the recession and the general pressure on IT budgets, the virtualisation sector has not been too badly hit. Organisations are still buying into the idea of cutting costs by reducing utility bills, improving the use of resources and freeing expensive datacentre space.
Gartner forecast earlier this year that global sales of virtualisation software would increase by 43 percent to $2.7bn (£1.7bn), with penetration levels rising from 12 percent in 2008 to 20 percent by the end of the year.
The research firm's definition of virtualisation includes infrastructure software, management tools and hosted virtual desktops, the last of which should triple in value to $298.6m by the end of the year. Infrastructure revenues are expected to grow 22.5 percent to $1.1bn, while management tool sales will jump 42 percent to $1.3bn.
Ad hoc deployments
But as part of that shift to virtualisation, too many companies are still failing to plan implementations in a structured, strategic way. Instead, deployments are often ad hoc, creating problems further down the line.
Kevin Green, infrastructure solutions manager at IT services provider Trustmarque Solutions, says virtualisation is a double-edged sword. "If you get it wrong, the headaches can outweigh the benefits. You have to look at the whole infrastructure, not just one or two elements, because in a virtualised environment, everything has a knock-on effect on everything else," he says.
That interdependency means it is crucial to plan implementations carefully, not least because the upfront costs of going down this route can be high.
Here are six essential considerations that need to be weighed up before embarking on a virtualisation project.