Akamai: How batteries could cut datacenter power bills

An approach being researched by the content delivery network provider helps reduce the grid-connected load during peak demand periods, when electricity is most expensive.

We've all read the various statistics about the "power-hungry" internet, which uses an estimated 1.5 percent of all the electricity generated globally.

How much power is that? According to one estimate, it's about 30 billion watts, which represents the output of 30 nuclear power plants. Or, to put it another way, it's an annual utility bill of about $8.5 billion.

Realistically, it's the prospect of mushrooming power bills that has most companies with big datacenter operations scrambling to introduce energy efficiency measures or to supplement their operations with on-site renewable energy installations.

Akamai, the big content delivery service provider, is also researching another alternative: using batteries to help reduce the amount of power that datacenters draw.

The approach, being researched by Akamai fellow and University of Massachusetts professor Ramesh Sitaraman, proposes using smart batteries within internet-scale distributed networks. The batteries would automatically begin supplying power when server loads hit specified peak levels, reducing the need for grid-connected power during those periods, when power usually costs more. They are recharged at night, when server loads are usually at their lowest and electricity rates are typically cheapest.
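The charge/discharge policy described above can be sketched as a simple threshold controller: whenever server load exceeds a chosen cap, the battery covers the excess so the grid draw stays at the cap, and spare headroom below the cap recharges it. The function below is a hypothetical illustration of that idea, assuming a single battery and a greedy policy; the names and parameters are ours, not the paper's actual algorithm.

```python
# Hypothetical sketch of threshold-based peak shaving. The greedy policy
# and all parameter names are illustrative assumptions, not the algorithm
# from the Akamai/UMass research paper.

def peak_shave(load_kw, threshold_kw, capacity_kwh, step_h=1.0):
    """Simulate one battery smoothing a server load trace.

    Returns the grid draw at each step: above the threshold, the
    battery discharges to cover the excess; below it, spare headroom
    recharges the battery (e.g. overnight, when rates are lower).
    """
    charge = capacity_kwh          # start fully charged
    grid = []
    for load in load_kw:
        if load > threshold_kw:
            # Discharge: cover as much of the excess as stored energy allows.
            deficit = load - threshold_kw
            supplied = min(deficit, charge / step_h)
            charge -= supplied * step_h
            grid.append(load - supplied)
        else:
            # Recharge using the headroom below the threshold.
            headroom = threshold_kw - load
            absorbed = min(headroom, (capacity_kwh - charge) / step_h)
            charge += absorbed * step_h
            grid.append(load + absorbed)
    return grid

# A toy daily trace in kW: quiet overnight, a midday peak.
trace = [2, 2, 2, 6, 9, 10, 9, 6, 3, 2]
print(peak_shave(trace, threshold_kw=7, capacity_kwh=5))
# → [2, 2, 2, 6, 7, 7, 9, 7, 7, 2]
```

In this toy run the battery trims the 10 kW peak, though with only 5 kWh of storage it runs dry mid-peak and one 9 kW step leaks through; it is this trade-off between battery capacity, cycle rate, and peak reduction that the research quantifies.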

"Anything that reduces the peak power consumption also reduces the impact on the environment and what you need to build out a datacenter," said Sitaraman.

The batteries could be built into the servers themselves or into the rack. The research currently focuses on lead-acid technology, but it could also be applied to lithium-ion models, according to Sitaraman.

"We show that batteries can provide up to 14 percent power savings, that would increase to 22 percent for more power-proportional next-generation servers, and would increase even more to 35.3 percent for perfectly power-proportional servers," wrote Sitaraman in the research paper discussing his findings. "Likewise, the cost savings, inclusive of the additional battery costs, range from 13.26 percent to 33.8 percent, as servers become more power-proportional. Further, much of these savings can be achieved with a small cycle rate of one full discharge/charge cycle every three days that is conducive to satisfactory battery lifetimes."

Mind you, this is theoretical right now, but it represents another way that companies might consider reducing both peak demand and the amount of power they need to source for their datacenters.