I read an insightful post, "The case for chargeback and virtual appliance," that Alex Barrett published yesterday. He raises several interesting points about accounting for virtual environments. Thanks, Alex, for calling our attention to this issue.
As with other posts I've seen, this one seems to equate virtualization with the use of virtual machine software. While the points presented are worth considering, it would be wise for IT decision-makers to take a broader and deeper view of the whole concept of virtual environments. Unfortunately, those who take that view are likely to see an even bigger problem than the good Mr. Barrett examines.
Rather than just lumping all IT-related costs together and accounting for them as part of the IT budget, many organizations use the instrumentation found in their operating environments, data management tools, application frameworks and, in some rare cases, the applications themselves to construct a way to pass on the costs of running an IT-based solution to the business units or departments that use that solution.
These organizations have found that it is very difficult to build a valid chargeback system for today's multi-platform, multi-tier, Web-based or service-oriented architecture solutions. Adding virtualization in the form of virtual machine software or operating system virtualization/partitioning makes the challenge even greater. Those who consider:
- access virtualization
- application virtualization
- all of processing virtualization
- storage virtualization
- network virtualization
- security services for all of those layers
find accounting for solution usage so that costs can be properly passed on to the appropriate business units or departments to be more than a bit daunting.
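To make the accounting challenge concrete, here is a minimal sketch of what a layered chargeback calculation might look like. All of the layer names, rates, and usage figures below are invented for illustration; a real system would draw metered usage from the instrumentation in each layer mentioned above.

```python
# Hypothetical chargeback sketch: allocate metered usage across
# virtualization layers back to business units. All rates and
# usage figures are invented for illustration only.

# Assumed per-unit rates for each virtualization layer
RATES = {
    "access": 0.02,        # per session-hour
    "application": 0.05,   # per app-instance-hour
    "processing": 0.10,    # per vCPU-hour
    "storage": 0.01,       # per GB-month
    "network": 0.005,      # per GB transferred
    "security": 0.03,      # per protected resource-hour
}

def chargeback(usage_records):
    """Sum layer costs per department from (department, layer, units) records."""
    bills = {}
    for dept, layer, units in usage_records:
        bills.setdefault(dept, 0.0)
        bills[dept] += units * RATES[layer]
    return bills

# Example metered usage, as instrumentation might report it
records = [
    ("finance", "processing", 1200),
    ("finance", "storage", 500),
    ("marketing", "processing", 300),
    ("marketing", "network", 2000),
]

bills = chargeback(records)
```

Even this toy version hints at the real difficulty: each layer must be independently metered, the meters must agree on which department consumed what, and the rates themselves are a matter of policy, not measurement.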
The use of virtualization, of course, didn't create this problem; it has just magnified and intensified a problem that already existed.
How does your organization account for system usage? Is it done the same way for virtualized environments?