
The greening of IT: Cooling costs

Analysis: How to take the heat off data centre energy bills
Written by Futurity Media, Contributor

Data centres don't have to be power-hungry monsters. Futurity Media's Stewart Baines looks at ways of reducing consumption.

Data centres are the nastiest, grizzliest hoggers of electricity in the known universe. That's the popular perception. In reality they use no more electricity than older, resource-intensive industries such as ship building or chemical production.

As manufacturing declines, data centres have taken up the slack and become the forges of the information age.

With booming electricity costs and severe shortages of supply, there is a tangible commercial reason for greening data centres - it will allow them to continue to grow.

Last year, the BroadGroup's Power & Cooling Survey found that the average UK data centre spent more than £5m per year on energy. By 2011, that figure is expected to exceed £11m.

With these increasing costs, data centres need to become more efficient. Moving forward, it is vital to review the way many are set up rather than 'nimby' them out of existence.

The power consumption of data centres already poses severe problems for the UK's local distribution networks. But their energy demands can be reduced significantly without undermining their importance to the economy.

Data centres are essential to our future computing and entertainment environment. Witness the growth of on-demand entertainment services such as IPTV, and businesses moving to a pay-as-you-grow software as a service (SaaS) model.

Do we need IPTV and SaaS? Consider the carbon footprint of burning DVDs and shipping them to an out-of-town video library. Every car journey made to the video store to rent a DVD produces more carbon than hosting the film in a data centre and streaming it on-demand.

And SaaS: how many over-powered standalone PCs are processing unnecessary applications in the background and then left on all night?

Critics say about half the energy used in an average data centre - enough to power a small town - is spent on cooling over-active processors.

A recent survey by storage vendor ONStor suggests that although many data centres in Europe are facing a crisis, most IT chiefs have no plans yet to tackle it.

The survey found that nearly half of the European companies polled believe they will outgrow their existing data centre and 58 per cent had recently run out of space, power and cooling capacity "with little or no warning".

Despite these warning signs, 29 per cent of companies had no plans to green the data centre to deal with these challenges, and another 29 per cent were still talking about it.

A breath of fresh air

It's not often you can say BT displayed considerable foresight but, according to Steve O'Donnell, head of its entire global data centre estate, the BT board recognised the environmental challenges it would face and in 1996 set a 10-year goal of reducing its carbon footprint by 60 per cent by 2006.

This target has been achieved and a new goal has been set of cutting the carbon footprint by a further 50 per cent by 2016.

Granted, BT now has 104,000 staff compared with 130,000 in 1996 so its office estate is considerably smaller, and the telecoms provider is a major proponent of collaboration and conferencing tools instead of mile munching. But a significant contributor to reducing the carbon footprint has been its data centre strategy.

O'Donnell is responsible for 72 BT data centres around the world plus a dozen sites managed for customers.

"We no longer use refrigeration as our primary source of cooling in the data centres. We have a 64kW cooling unit that brings in air from outside. On the odd day it hits 40 degrees outside, we may have to turn to refrigeration slightly but it's quite uncommon," says O'Donnell.

"We save 80 per cent of normal cooling costs of data centres by not using refrigeration."

BT runs its data centres hotter than most, which means less load on the cooling. Because of its purchasing scale, BT insists its hardware is capable of operating at up to 50 degrees.

"We also run kit off direct current (DC). There's often a lot of waste between the high voltage coming in from the street and that which powers the chips in the computer," says O'Donnell.

"Usually, it comes in as AC, gets converted to DC to charge batteries, then the UPS regenerates AC, and the power supply changes it back to DC. It gets converted three times. We only convert it once, which saves 35 per cent of the power stream energy."

This combination of fresh air cooling and the use of DC means BT saves 60 per cent of the energy a conventional data centre uses.
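The waste O'Donnell describes compounds at each conversion stage. A minimal sketch of that compounding, using an assumed per-stage efficiency of 85 per cent purely for illustration (the article does not give per-stage figures):

```python
def delivered_power(input_kw, stage_efficiencies):
    """Power remaining after passing through each conversion stage."""
    power = input_kw
    for eff in stage_efficiencies:
        power *= eff
    return power

# Conventional chain: AC->DC (battery charge), UPS DC->AC, PSU AC->DC.
# The 0.85 per-stage efficiency is an illustrative assumption.
conventional = delivered_power(1.0, [0.85, 0.85, 0.85])

# BT's approach: a single AC->DC conversion feeding DC equipment.
single = delivered_power(1.0, [0.85])

print(f"three conversions: {conventional:.3f} kW delivered per kW drawn")
print(f"one conversion:    {single:.3f} kW delivered per kW drawn")
```

With three stages, losses multiply rather than add, which is why cutting two of them out recovers a disproportionate share of the input power.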

Reduce the rack load

Another key strategy to reduce the cooling load is to scale back the server density. Many vendors are pushing high-density racks that draw 20kW, which in turn entails water cooling.

"We got rid of water cooling when we got rid of the mainframes. We don't want it back in the data centre," says O'Donnell.

With density at this level, it would cost more to power the server during its lifetime than it costs to buy the server itself. "A 1kW server actually is a 2.5kW server - there's the PC processing, and all the power lost in AC/DC/AC conversion and refrigeration," adds O'Donnell.
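O'Donnell's "a 1kW server actually is a 2.5kW server" claim can be turned into a rough lifetime-cost calculation. The overhead factor comes from the quote; the tariff, lifetime and hours are illustrative assumptions, not figures from BT:

```python
def lifetime_energy_cost(nameplate_kw, overhead_factor, pence_per_kwh, years):
    """Estimate the electricity bill for one server over its lifetime.

    overhead_factor folds in AC/DC/AC conversion losses and refrigeration,
    per O'Donnell's 2.5x figure. Tariff and lifetime are assumptions.
    """
    effective_kw = nameplate_kw * overhead_factor
    hours = years * 365 * 24
    pence = effective_kw * hours * pence_per_kwh
    return pence / 100  # pounds

# 1kW nameplate, 2.5x overhead, assumed 10p/kWh over a 4-year life.
cost = lifetime_energy_cost(1.0, 2.5, 10, 4)
print(f"lifetime energy cost: about £{cost:,.0f}")
```

Even at a modest assumed tariff, the result lands in the same territory as the purchase price of a commodity server, which is the comparison the article is making.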

BT's server density does not exceed 3kW or 4kW per rack, a level where fresh air cooling is effective.

Ever since the deregulation of the power industry, there have been problems with investment in the power distribution network.

Organisations have one of two options. The first is to move the data centre to an area that once had a considerable manufacturing industry, such as Middlesbrough or Clydeside, and utilise its existing distribution capability.

Alternatively they can create their own electricity with a combined heat and power (CHP) generator, which is what BT is doing. These generators can be quite small. A 2.5MW unit the size of two sea containers is enough to power 800 racks. Any excess power can be sold on if the unit is attached to the national grid.

CHP is more efficient than traditional power generation: 100MW of coal yields about 30MW of electricity, and the remaining 70 per cent is waste heat. Contrast that with 100MW of gas energy put into a CHP, which yields about one-third electricity, one-third usable heat and one-third waste. The heat can then be used to drive the refrigeration plant.
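The comparison above is just an energy balance, and can be sketched directly from the figures the article gives (the splits are the article's round numbers, not measured plant data):

```python
def energy_split(input_mw, fractions):
    """Return (electricity, usable_heat, waste) in MW for a given split."""
    elec, heat, waste = (input_mw * f for f in fractions)
    return elec, heat, waste

# Coal plant: ~30% electricity, no captured heat, ~70% waste.
coal = energy_split(100, (0.30, 0.0, 0.70))

# Gas CHP: roughly one-third each electricity, usable heat, waste.
chp = energy_split(100, (1 / 3, 1 / 3, 1 / 3))

print(f"coal: {coal[0]:.0f}MW electricity, {coal[2]:.0f}MW waste")
print(f"CHP:  {chp[0]:.0f}MW electricity + {chp[1]:.0f}MW usable heat, "
      f"{chp[2]:.0f}MW waste")
```

The point is that CHP's useful output (electricity plus heat) is around two-thirds of the input, against roughly 30 per cent for the coal plant, because the heat that drives the refrigeration would otherwise be discarded.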

Virtual machines

In the UK many data centres are entrenched in their existing locations with no room for their own CHP plant. Managers will need to concentrate on improving efficiency, not just the use of power for cooling but the power the processors themselves consume.

Many are looking to virtualisation for a dramatic reduction in the demands of over-powered, under-utilised servers.

Sheffield Hallam University, which has two data centres, began the virtualisation process four years ago. Now it has migrated most of its applications to VMware virtualisation. One quad-core server runs on average 18 applications. In all there are 195 virtual machines.

Servers are now utilised far more heavily than the former 10 to 15 per cent rate, so power consumption per server is up slightly: around 1.2A for a traditional 1U server versus 1.5A to 1.7A for a virtualised one. But overall, energy costs have been slashed.

According to Dave Thornley, who manages the university's data centres, it has saved about £43,000 on power bills every year. The power required to run half the data centre on physical machines has been estimated at 686,000kWh per year; the virtual infrastructure runs 195 virtual machines at only 60,500kWh.
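The Sheffield Hallam figures can be cross-checked with a few lines of arithmetic. The kWh and £ numbers below are the ones quoted in the article; the implied tariff is simply derived from them, not a figure the university has stated:

```python
# Figures quoted in the article for Sheffield Hallam University.
physical_kwh_per_year = 686_000   # estimate for half the estate on physical kit
virtual_kwh_per_year = 60_500     # 195 virtual machines
saving_gbp_per_year = 43_000      # reported annual power-bill saving

saved_kwh = physical_kwh_per_year - virtual_kwh_per_year
implied_tariff = saving_gbp_per_year / saved_kwh  # £/kWh implied by the figures

print(f"energy saved:   {saved_kwh:,} kWh per year")
print(f"implied tariff: £{implied_tariff:.3f}/kWh")
```

The implied unit price works out to roughly 7p/kWh, which is plausible for the period, so the quoted energy and cash savings are at least internally consistent.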

BT is also using virtualisation - Solaris and VMware containers - and boasts of running 20 applications on a single box. It has also weeded out all the orphaned applications and servers without a corporate sponsor. "That took out about 10 per cent of our global estate," adds BT's O'Donnell.

BT has also shifted much of its storage to tape, which requires no electricity when it's unused, as opposed to disk storage, which always requires power. For many companies storage is a compliance burden but, unless it needs to be accessed in real time, disk storage is a waste of money and energy.

Other techniques are emerging for reducing the energy load. Some are self-evident and low-cost, such as rewiring beneath racks to allow air to circulate more freely. Others will require capital investment, such as intelligent cooling that, as the name implies, only cools where processors are in use and operating near maximum.
