
Turning up the heat to cool down

Written by Alan Priestley, Cloud Builders member/vendor blogger (Intel)

In my last blog post I touched on the need to create energy-efficient data centres to ensure cloud computing lives up to its true potential. This time around I’d like to focus on a specific method for improving efficiency through high temperature ambient (HTA) data centres.

However, before I get rolling, I'd like to put the need for greater efficiency in context. First up, data centres are estimated to consume 1.5 percent of the world's total power, and that consumption continues to increase. In concrete terms, that's the equivalent of the output of 50 power stations each year. They also generate 210 million metric tons of CO2, roughly the same as 41 million cars.

A considerable proportion of the energy consumed by data centres ends up as heat, which, under the traditional view of facility management, threatens the reliable running of servers. Consequently, data centre operators have cooled their facilities to between 18 and 21°C to keep the IT equipment cool. Ironically, this cooling process itself accounts for a considerable portion of a facility's overall energy demand.

We're in this position for a number of reasons, including the fact that IT equipment manufacturers have traditionally specified their systems to operate at 20-25°C, chiefly to ensure reliable operation. However, operating a data centre at a higher ambient temperature, and using free cooling resources such as outside air, can reduce energy consumption and in turn lower annual CO2 emissions.

One commonly used metric for quantifying data centre energy efficiency is The Green Grid's Power Usage Effectiveness (PUE): the ratio of the total power entering a data centre to the power actually used by the IT equipment.

A PUE of 2, for example, means that only 50 percent of the power entering a data centre is actually consumed by the IT equipment; the rest goes to the facility equipment, chiefly the systems cooling the data centre. This is clearly inefficient.

Reducing the PUE towards 1.0 means more of the inbound power is actually used by the IT equipment and less is spent cooling the data centre, which can translate into direct financial savings for the operator.
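To make the arithmetic concrete, here is a minimal sketch in Python of how an operator might compute PUE from metered power; the meter readings below are hypothetical figures chosen to match the example above, not data from any real facility:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical meter readings for a small facility.
total_kw = 2000.0  # power drawn at the utility feed
it_kw = 1000.0     # power delivered to servers, storage and network

ratio = pue(total_kw, it_kw)
overhead_kw = total_kw - it_kw  # cooling, distribution losses, lighting

print(f"PUE = {ratio:.2f}")                                  # PUE = 2.00
print(f"IT share of inbound power: {it_kw / total_kw:.0%}")  # 50%
print(f"Overhead (mostly cooling): {overhead_kw:.0f} kW")    # 1000 kW
```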

There are many ways to lower a data centre's PUE, but raising the ambient temperature can have a striking impact. Facebook, for example, raised the operating temperature of its Santa Clara data centre from the typical 18-21°C to 27°C. Its annual energy bill fell by $229,000 as a result, and the change earned it a $294,761 energy rebate.

Intel's IT department has run a proof of concept at a data centre in New Mexico to evaluate the value of high ambient temperatures and free cooling. The facility ran 900 production servers with 100 percent air exchange at 33°C, with no humidity control and only minimal air filtration. Compared with operating at the usual 18-21°C, it delivered an estimated 67 percent power saving, which translated into approximately $2.87 million off the cost of power.
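To see how a sustained power saving becomes a cost saving, here is a back-of-envelope sketch; the load, hours and tariff below are assumed figures for illustration, not Intel's actual inputs:

```python
# Back-of-envelope: what a sustained power saving is worth per year.
# All inputs are assumed illustrative figures, not Intel's actual data.
saved_kw = 500.0       # average power no longer drawn, thanks to free cooling
hours_per_year = 8760  # 24 hours x 365 days
price_per_kwh = 0.10   # assumed electricity tariff in $/kWh

annual_saving = saved_kw * hours_per_year * price_per_kwh
print(f"Annual saving: ${annual_saving:,.0f}")  # Annual saving: $438,000
```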

Another example of HTA in practice is the Yahoo Computing Coop, a data centre that operates with no chillers and needs water for cooling on only a few days a year. It relies on 100 percent natural air flow, so less than 1 percent of the building's total energy consumption goes on cooling, and its estimated PUE is 1.08.
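The same PUE arithmetic shows why 1.08 is so striking: the non-IT overhead is only 8 percent of the IT load, or roughly 7 percent of total inbound power. A quick sketch, assuming a hypothetical 1 MW IT load:

```python
it_kw = 1000.0  # assumed 1 MW IT load, for illustration only
pue = 1.08      # the facility's estimated PUE

total_kw = it_kw * pue          # inbound power implied by the PUE
overhead_kw = total_kw - it_kw  # everything that is not IT load

print(f"Inbound power: {total_kw:.0f} kW")  # Inbound power: 1080 kW
print(f"Overhead: {overhead_kw:.0f} kW, "
      f"{overhead_kw / total_kw:.1%} of the total")  # 80 kW, 7.4% of the total
```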

Of course, many elements go into producing a truly energy-efficient data centre, and in turn delivering the full benefits of cloud computing, from increasingly powerful yet more energy-efficient processors to server platform innovation. But there is no escaping the fact that raising ambient temperatures and using free cooling resources can improve the energy efficiency of data centres dramatically.
