
Cool runnings: IBM's recipe for a happy datacentre, in pictures

How do you make your datacentre run better, and save money? At IBM's research labs and facilities in upstate New York, three ideas are uppermost on people's minds: energy efficiency, monitoring, and utilisation.
[Image 1 of 8: ibm-green-datacentre.jpg (Jon Yeomans/ZDNet)]

Save energy wherever you can, and make better use of your computers: those are the mantras at IBM's research labs in Poughkeepsie, New York.

These tenets apply both to the equipment IBM is putting into datacentres, and to the infrastructure surrounding them. The result, the company hopes, is a datacentre that saves energy and money - and it is exploring seemingly every feasible avenue to achieve its aims.

"If you look at a pie chart on how much energy is used in a datacentre, roughly half the pie goes to infrastructure, and half goes to the IT," says Roger Schmidt, IBM fellow and chief engineer for datacentre energy efficiency. "That's not good. The work is done by the IT, right? So you want to make the infrastructure half of the pie as small as possible. We're trying to tie the equipment more tightly to the environment so we can save energy."

For IBM, this means a focus on two areas: air-cooling and water-cooling. These are nowhere more apparent than in the company's 'green datacentre' at Poughkeepsie (above), which has an air-cooled side and a water-cooled side.

The air-cooled side of the room relies on a principle as simple as opening a window. "It's cold outside," Schmidt says, pointing to the minor blizzard hitting the vast plain of the IBM car park. "Just open these windows. Why can't we do that?" In fact, the industry is doing just that, he says: using ambient air to cool datacentre equipment. The upside is that air-conditioning can be turned off, saving power and emissions.

[Image 2 of 8: ibm-datacentre-temp-v1.jpg (Jon Yeomans/ZDNet)]

Of course, not every datacentre can be built in chilly climes. "The latency is the problem; the delay. A lot of companies have to have datacentres located near their work, and in the country," Schmidt says. Some places, for example, mandate that data cannot leave the country for data protection reasons.

But even if a business is located in a warmer climate, its datacentre can still tap into its surroundings, partly because the industry is moving away from the notion that datacentres have to stay nicely chilled. IBM's chips, Schmidt says, are designed to work in temperatures of up to 85°C (185°F). Keep the equipment below that, and it should work just fine.

It follows, therefore, that you can use outside water at ambient temperature to chill your datacentre - creating the seemingly paradoxical notion of 'hot-water cooling'. "Like the air cooling... why can't we use the outside water to cool?" Schmidt says. The advantage, of course, is that if you can use water at ambient temperature, you're not using energy to chill it first.

Some of the company's x-series servers have inbuilt water-cooling with a water temperature of 45°C, and IBM is looking to push the water temperature for cooling even higher, to around 65°C: "What we've been talking about is using the water to cool the chips, then it goes through the system and comes out hotter, maybe 70°C, and we use that hotter water for the heating of buildings and central distribution in towns," Schmidt says. "There's a lot of interest in this because we've never really been able to solve the waste heat problem for datacentres. So now we use hotter water, we can use that waste heat for good purposes."

The principle is illustrated above: the dial on the left shows the temperature of the water coming in from the Hudson River, just outside the IBM plant. The right-hand dial shows the water temperature on the way out, after it has collected heat from the datacentre.
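The appeal becomes clear with some back-of-the-envelope arithmetic: the heat a water loop carries away is flow rate times the specific heat of water times the temperature rise. Here is a minimal sketch in Python, where the 10 kg/s flow rate and the temperatures are illustrative assumptions, not readings from the Poughkeepsie plant:

```python
# Waste-heat sketch: Q = flow * specific_heat * temperature_rise.
# The 10 kg/s flow and the temperatures are illustrative assumptions.

SPECIFIC_HEAT_WATER = 4186  # J/(kg*K)

def recoverable_heat_kw(flow_kg_per_s: float, t_in_c: float, t_out_c: float) -> float:
    """Heat carried off by the cooling loop, in kilowatts."""
    return flow_kg_per_s * SPECIFIC_HEAT_WATER * (t_out_c - t_in_c) / 1000.0

# Water entering the chip loop at 45°C and leaving at 70°C, at 10 kg/s:
print(f"{recoverable_heat_kw(10, 45, 70):.0f} kW")  # ~1047 kW
```

Even a modest flow at a 25°C rise adds up to around a megawatt of heat - the kind of output worth piping into a district heating network.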

[Image 3 of 8: ibm-datacentre-water-cooling.jpg (Jon Yeomans/ZDNet)]

Pictured above: water-cooling at IBM's Poughkeepsie datacentre.

The green datacentre is a working facility, processing jobs for IBM customers and Big Blue itself.

Three years ago, the facility was running out of space and suffered cooling problems. Rather than build a new centre, IBM called in a team of designers to remodel the space, and in the process created a showroom highlighting datacentre best practice.

Some of the measures it took are startlingly low-fi: brushes around pipes, for example, stop air escaping; bendy underfloor pipes mean the water-cooling configuration can be changed at any time; and the air vents in the floor are different sizes for the air-cooled side and the water-cooled side, according to which needs more ventilation.

In the back room, there's a sub-station that allows DC electricity to go straight into the equipment without conversion. "There's a lot of discussion in industry: why do we need all these conversion paths all the way through: DC to AC, AC to DC?" Schmidt says. Each conversion loses energy along the way, so plugging DC straight into the datacentre avoids that leakage. "We've now certified some of our high-end systems [for DC power]."
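The arithmetic behind that argument is straightforward: chained conversions multiply their losses. A minimal sketch, with made-up per-stage efficiencies for illustration:

```python
# Each AC/DC conversion stage wastes a slice of power; the losses compound.
# Per-stage efficiencies are illustrative assumptions, not measured values.
from math import prod

AC_PATH = [0.96, 0.95, 0.92]  # e.g. UPS, transformer, server PSU stages
DC_PATH = [0.97]              # single rectification stage at the substation

def delivered_fraction(stage_efficiencies: list[float]) -> float:
    """Fraction of input power that actually reaches the IT load."""
    return prod(stage_efficiencies)

print(f"AC path delivers {delivered_fraction(AC_PATH):.1%}")  # ~83.9%
print(f"DC path delivers {delivered_fraction(DC_PATH):.1%}")  # 97.0%
```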

[Image 4 of 8: ibm-green-datacentre-rear-door-filter.jpg (Jon Yeomans/ZDNet)]

Cabinets are fitted with rear-door heat exchangers, which suck in the hot exhaust air of the servers and blast it out again as cold air. The technology - the IP is owned by IBM - can take away up to 60 percent of the heat in a full rack, the company claims.

The result of these changes is a datacentre PUE (power usage effectiveness) - the ratio of total facility power to the power reaching the IT equipment - of 1.27; the remodelling paid for itself within 10 months.
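For context, Schmidt's "half the pie goes to infrastructure" scenario works out to a PUE of roughly 2.0. A quick sketch of the arithmetic, with illustrative kilowatt figures:

```python
# PUE = total facility power / IT equipment power; 1.0 would be perfect.
# Kilowatt figures below are illustrative, not IBM's actual loads.

def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power usage effectiveness: lower is better, 1.0 is the floor."""
    return total_facility_kw / it_kw

# Schmidt's "half the pie goes to infrastructure" scenario:
print(pue(total_facility_kw=2000, it_kw=1000))  # 2.0

# The green datacentre's 1.27: only 270 kW of overhead per MW of IT load.
it_kw = 1000
overhead_kw = it_kw * 1.27 - it_kw
print(f"{overhead_kw:.0f} kW of infrastructure per {it_kw} kW of IT")  # 270 kW
```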

[Image 5 of 8: ibm-datacentre-cabling.jpg (Jon Yeomans/ZDNet)]

Above: tidy cabling beneath the flooring of IBM's green datacentre allows air to move unimpeded.

IBM is also looking at other types of cooling: it is considering a new type of fan from a partner, originally developed for the military, which delivers around 50 percent better performance and around 30 percent better acoustics, according to Schmidt.

Acoustic performance is of growing importance, because of the sheer noise fans can generate - noise that could fall foul of workplace laws in Europe, for example.

More out-there technologies include the idea of dunking the IT equipment in a special fluid, Schmidt says: "There's talk in the industry of dipping the whole system in baths of dielectric fluids. We have several projects in the lab looking at them."

[Image 6 of 8: ibm-mmt-station.jpg (Jon Yeomans/ZDNet)]

Getting a datacentre's infrastructure right is a huge part of the battle - but it goes hand in hand with better monitoring.

IBM developed its Measurement & Management Technologies (MMT) to better understand power usage in its datacentres. Naturally enough, it's now offering them to clients.

MMT comes in a manual version and a robot version - pictured above is the manually guided sensor trolley, which gathers data from sensors placed all around the room to create a 3D heat map.
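Schmidt doesn't detail how MMT turns scattered readings into a map, but a common approach to this kind of problem is inverse-distance weighting: estimate each grid point from nearby sensors, weighted by proximity. A minimal 2D sketch follows - the sensor positions and temperatures are invented, and this is a generic technique, not IBM's algorithm:

```python
# Build a coarse heat map from scattered temperature sensors using
# inverse-distance weighting. Sensor positions and readings are invented;
# this is a generic sketch, not IBM's MMT implementation.

def idw_temperature(x: float, y: float,
                    sensors: list[tuple[float, float, float]],
                    power: float = 2.0) -> float:
    """Estimate temperature at (x, y) from (sx, sy, temp) sensor tuples."""
    num = den = 0.0
    for sx, sy, temp in sensors:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0:
            return temp  # point sits exactly on a sensor
        weight = 1.0 / d2 ** (power / 2)
        num += weight * temp
        den += weight
    return num / den

sensors = [(0, 0, 22.0), (10, 0, 31.5), (0, 10, 24.0), (10, 10, 28.0)]
for y in range(0, 11, 5):
    row = [idw_temperature(x, y, sensors) for x in range(0, 11, 5)]
    print(" ".join(f"{t:5.1f}" for t in row))
```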

"Measure chiller power, measure cooling tower power, measure pump power, measure the IT – and then plot it," says Schmidt.

[Image 7 of 8: ibm-datacentre-monitoring.jpg (Jon Yeomans/ZDNet)]

In the corner, a bank of monitors displays an array of data gleaned from the room's sensors and run through IBM's Tivoli software. 

Monitoring can save around 20 percent of energy in the datacentre, Schmidt says.

[Image 8 of 8: ibm-datacentre-racks.jpg (Jon Yeomans/ZDNet)]

But if advances in cooling tech and better monitoring deliver a more efficient datacentre, there's a catch: usage just keeps going up and up. "Even though the performance-per-watt has improved significantly, clients buy more and more of the equipment. They can't get enough of it," Schmidt says (our obsession with storing everything - from work documents to family photos and videos - is only adding to the load).

To keep pace with demand, utilisation of equipment has to be better, Schmidt argues, citing the example of a humble laptop: Task Manager will tell you how much of your system is being utilised - and it's usually just a couple of percent. "But the power is max power. It doesn't make sense," Schmidt says. "So there's a lot of effort on ramping the power with the performance now. If you aren't using that performance, and you don't need the power, start turning stuff off, or put it in sleep mode."
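What Schmidt is describing is the industry's push toward energy-proportional computing. A toy linear power model - the 200W idle and 400W peak figures are assumptions for illustration - shows why a near-idle server is so wasteful:

```python
# Toy energy-proportionality model: power = idle + (peak - idle) * utilisation.
# The 200 W idle and 400 W peak figures are assumptions for illustration.

IDLE_W, PEAK_W = 200.0, 400.0

def server_power_w(utilisation: float) -> float:
    """Linear interpolation between idle and peak power draw."""
    return IDLE_W + (PEAK_W - IDLE_W) * utilisation

for util in (0.02, 0.50, 1.00):
    print(f"{util:.0%} utilised -> {server_power_w(util):.0f} W")
# 2% utilised -> 204 W: barely any work done, yet over half of peak power.
# Consolidating jobs onto fewer, busier machines is what lifts utilisation.
```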

In terms of datacentres, the aim is to get utilisation up to 50 percent.

And while energy efficiency, monitoring and utilisation remain IBM's prescription for a healthy datacentre, Schmidt will continue looking to improve IT cooling technology. IBM's roots with water cooling go all the way back to 1964, and the company is still sticking with it: "I don't see us deviating from that, but we always look at other technologies, in case we're missing something," he says.
