Cabinets are fitted with rear-door heat exchangers, which draw in the servers' hot exhaust air and expel it as cold air. The technology - the IP is owned by IBM - can remove up to 60 percent of the heat in a full rack, the company claims.
The result of these changes is a datacentre PUE (power usage effectiveness) of 1.27, which generated a return on IBM's investment within 10 months.
Above: tidy cabling beneath the flooring of IBM's green datacentre allows air to move unimpeded.
IBM is also looking at other types of cooling: it is considering a new type of fan, developed by a partner for the military, which delivers around 50 percent better performance and around 30 percent better acoustics, according to Schmidt.
Acoustic performance is of growing importance, because of the sheer noise fans can generate - noise that could fall foul of workplace laws in Europe, for example.
More out-there technologies include the idea of dunking the IT equipment in a special fluid, Schmidt says: "There's talk in the industry of dipping the whole system in baths of dielectric fluids. We have several projects in the lab looking at them."
Getting a datacentre's infrastructure right is a huge part of the battle - but it goes hand in hand with better monitoring.
IBM developed its Measurement & Management Technologies (MMT) to better understand power usage in its datacentres. Naturally enough, it is now offering the technology to clients.
MMT comes in a manual version and a robot version - pictured above is the manually-guided sensor trolley, which gathers data from sensors placed all around the room to create a 3D heat map.
"Measure chiller power, measure cooling tower power, measure pump power, measure the IT – and then plot it," says Schmidt.
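The recipe Schmidt describes amounts to computing PUE: divide total facility power (IT load plus cooling overheads) by the IT power alone. A minimal sketch of that arithmetic, with illustrative component figures (not IBM's actual measurements) chosen to reproduce the 1.27 PUE mentioned above:

```python
def pue(it_kw, chiller_kw, cooling_tower_kw, pump_kw, other_kw=0.0):
    """Power usage effectiveness: total facility power / IT power.

    A PUE of 1.0 would mean every watt goes to the IT equipment;
    higher values reflect cooling and distribution overhead.
    """
    total_kw = it_kw + chiller_kw + cooling_tower_kw + pump_kw + other_kw
    return total_kw / it_kw

# Illustrative example: 1,000 kW of IT load plus 270 kW of
# cooling overhead yields a PUE of 1.27.
print(round(pue(it_kw=1000, chiller_kw=180, cooling_tower_kw=40, pump_kw=50), 2))
```

Plotting these readings over time, as Schmidt suggests, shows how PUE shifts with load and outside temperature.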