(Post updated Monday, Aug. 1, to include link to Koomey's new research.)
It may be time to revise the figures we keep tossing around about the huge appetite of data centers when it comes to electricity. An article in The New York Times based on research by Stanford University professor Jonathan Koomey suggests that the global recession, combined with the fast emergence of server virtualization technologies and more power-efficient microprocessors, has helped curb the demand for electricity.
That's not to say that data center power consumption isn't growing. According to The New York Times article, Koomey's study of data center power consumption from 2005 to 2010 found that electricity usage increased about 56 percent during the time period. In the United States, the growth rate was 36 percent.
The point is that both of those rates are far slower than the figures predicted by the U.S. Environmental Protection Agency back in 2007. That report sounded the alarm about a doubling of electricity consumption related to data centers. The EPA figured it would take 100 billion kilowatt-hours of electricity, at an annual cost of $10 billion, to power all those email servers, Internet service providers and data centers during the five-year period in question.
The Times quotes Professor Koomey:
"Mostly because of the recession, but also because of a few changes in the way these facilities are designed and operated, data center electricity is clearly much lower than what was expected, and that's really the big story."
I personally welcome this revelation, and plan to spend some more time with the report, which you can find here in detail.
Certainly, the slower economy was a factor in slowing the growth of electricity consumption, although we don't really know by how much. But the fact that power consumption grew more slowly than expected is a great tribute to all the work on energy efficiency that has been going on across the IT industry. Now, if we could just figure out how to make renewable energy a bigger part of the data center equation.