Cloud computing could cost more than using your own systems

My ZDNet colleague Joe McKendrick recently posted "Why cloud computing may cost more than on-premise systems," in which he lays out some of the reasons why a cloud computing solution just might be more costly than one hosted in an organization's own datacenter. As with any technology or approach to computing, cost savings are never automatic. They must be won with good ideas, good planning, and good execution.
I believe that part of an organization's blindness to these requirements stems from some executives assuming that the major costs of an IT solution are the systems themselves and the software that runs on them. Depending upon the circumstances and the architecture of the solution, other costs can simply dwarf the cost of hardware and software.
In other words, even if the hardware and software were given to an organization free of any charge, a solution could still end up costing more if that approach required more people, more types of expertise, more operational steps to achieve the goals, and the like. In the many cost-of-ownership and return-on-investment studies my team at IDC, and other teams there, conducted during my time with that company, another view came to the forefront.
Staff-related costs were often 50% to 70% of the total cost over a period of three years. Communications, power, cooling, and facilities could add up to another 30% to 40% of the total. Hardware and software, when combined, usually represented somewhere between 20% and 25% of the costs.
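To make those proportions concrete, here is a minimal sketch of how such a three-year total-cost-of-ownership breakdown works. The dollar figure and the mid-range shares (staff 60%, facilities and communications 20%, hardware and software 20%) are illustrative assumptions picked from within the ranges above, not figures from any IDC study.

```python
def tco_breakdown(total_cost, shares):
    """Split a total cost of ownership into named dollar components.

    shares maps a component name to its fraction of the total (should sum to 1.0).
    """
    return {name: total_cost * fraction for name, fraction in shares.items()}

# Hypothetical three-year TCO of $1M, split using assumed mid-range shares.
shares = {
    "staff": 0.60,               # people: the largest component
    "facilities_comms_power": 0.20,
    "hardware_software": 0.20,   # the part executives tend to focus on
}
breakdown = tco_breakdown(1_000_000, shares)
```

With these assumed numbers, the hardware and software that attract most of the budget scrutiny account for only $200,000 of the $1M, while staff costs run to $600,000.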
Surprised to find that people are the most costly component of an IT solution? This is why organizations choose to outsource development, operations, helpdesk functions, and the like to countries with the lowest staff costs. The savings can drop right to the bottom line.
It is also why organizations choose physical locations for their datacenters that offer low real estate, power, and cooling costs.