Grid computing is much more widespread in businesses than first thought, new research has found, but arguments over who should pay for it could well be stymieing adoption.
Analyst house Quocirca's latest Insight report into grid computing has found that grid-related technologies, such as virtualisation, are acting as the thin end of the wedge in introducing enterprises to the benefits of grid.
"Grid computing still has to move from idea to action for most organisations, but such a move is very likely as the momentum behind virtualisation will drive this natural evolution," the report says.
North America is the farthest ahead globally in terms of adoption, with Scandinavia beating the rest of Europe to rollouts. However, the problems dogging the rise of grid look to be the standard woes that hamper any emerging technology.
The report says: "IT professionals still highlight limited familiarity and skills availability as two of the main constraining factors on grid computing adoption, along with the need for solutions to mature further and for standards to firm up."
However, one of the main drivers behind grid — sharing resources across an enterprise — could also prove to be its undoing, the report suggests, as individual departments will seek to avoid footing the bill for the technology.
"The whole point of a grid computing environment is to achieve the smooth and dynamic sharing of resources between different applications and different areas of the business in response to fluctuating demand. Unfortunately, the consequence is that it becomes more difficult to determine who owns or should pay for a given component or operational framework, which is an issue when budgets are aligned to business units or departments."