Cool runnings: IBM's recipe for a happy datacentre, in pictures

Summary: How do you make your datacentre run better, and save money? At IBM's research labs and facilities in upstate New York, three ideas are uppermost on people's minds: energy efficiency, monitoring, and utilisation.



  • Pictured above: water-cooling at IBM's Poughkeepsie datacentre.

    The green datacentre is a working facility, processing jobs for IBM customers and Big Blue itself.

    Three years ago, the facility was running out of space and suffered cooling problems. Rather than build a new centre, IBM called in a team of designers to remodel the space, and in the process created a showroom highlighting datacentre best practice.

    Some of the measures it took are startlingly low-tech: brushes around pipes, for example, stop air escaping; flexible underfloor pipes mean the water-cooling configuration can be changed at any time; and the air vents in the floor are sized differently for the air-cooled and water-cooled sides, according to which needs more ventilation.

    In the back room, there's a sub-station that allows DC electricity to go straight into the equipment without conversion. "There's a lot of discussion in the industry: why do we need all these conversion paths all the way through: DC to AC, AC to DC?" Schmidt says. Each conversion loses energy along the way, so plugging DC straight into the datacentre avoids that leakage. "We've now certified some of our high-end systems [for DC power]."
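    The energy argument for DC distribution is multiplicative: each conversion stage passes on only a fraction of the power it receives, so fewer stages means less waste. The sketch below illustrates the idea with assumed, illustrative efficiency figures; these are not IBM's measured numbers.

    ```python
    # Illustration of cumulative loss across power-conversion stages.
    # Efficiency values below are assumptions for the sake of example,
    # not figures from IBM's facility.

    def chain_efficiency(stage_efficiencies):
        """Overall efficiency of a series of power-conversion stages."""
        overall = 1.0
        for eff in stage_efficiencies:
            overall *= eff
        return overall

    # Typical AC path: UPS (AC/DC/AC), server PSU (AC/DC), voltage regulation.
    ac_path = chain_efficiency([0.94, 0.92, 0.90])

    # Hypothetical DC distribution: one less lossy conversion stage.
    dc_path = chain_efficiency([0.96, 0.92])

    print(f"AC path delivers {ac_path:.1%} of input power")
    print(f"DC path delivers {dc_path:.1%} of input power")
    ```

    With those assumed numbers, the DC path delivers roughly ten percentage points more of the incoming power to the IT load, which is the leakage Schmidt is describing.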

  • Cabinets are fitted with rear-door heat exchangers, which suck in the hot exhaust air of the servers and blast it out again as cold air. The technology, whose IP is owned by IBM, can take away up to 60 percent of the heat in a full rack, the company claims.

    The result of these changes is a datacentre PUE (power usage effectiveness) of 1.27, and the project paid back IBM's investment within 10 months.
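    PUE is simply the ratio of total facility power to the power consumed by the IT equipment itself, so a PUE of 1.27 means the facility draws 27 percent more power than the IT load alone. A minimal sketch, with illustrative load figures that are assumptions rather than IBM's actual numbers:

    ```python
    # PUE (power usage effectiveness) = total facility power / IT equipment power.
    # A PUE of 1.0 would mean every watt reaches the IT gear.
    # The kW figures below are hypothetical, chosen to match the quoted ratio.

    def pue(total_facility_kw, it_equipment_kw):
        return total_facility_kw / it_equipment_kw

    # e.g. 1,270 kW drawn by the whole facility to run 1,000 kW of IT load
    print(pue(1270.0, 1000.0))  # -> 1.27
    ```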

  • Above: tidy cabling beneath the flooring of IBM's green datacentre allows air to move unimpeded.

    IBM is also looking at other types of cooling: it is considering a new type of fan from a partner, originally developed for the military, which delivers around 50 percent better performance and around 30 percent better acoustics, according to Schmidt.

    Acoustic performance is of growing importance because of the sheer noise fans can generate, noise that could fall foul of workplace laws in Europe, for example.

    More out-there technologies include the idea of dunking the IT equipment in a special fluid, Schmidt says: "There's talk in the industry of dipping the whole system in baths of dielectric fluids. We have several projects in the lab looking at them."

Topics: Cloud, Data Centers, Datacentre Tour, Emerging Tech, IBM, Servers


Reader comments
  • IBM's racks suck!

    Take a look at the cable management. It is horrible. We had a project where IBM brought in their own rack as part of a solution, and we had to live with it for five years until we finally got rid of that garbage. There is virtually no cable management, or poor cable management at best. We standardized on APC wide and deep racks with proper cable management, and it's far superior. They really need to do something about this if they want to provide complete solutions that work well in a professional server environment. I'd send back any solution that strings cables everywhere like that. Just my 2 cents.
  • half?

    "So you want to make the infrastructure half of the pie as small as possible."

    I think I see your problem.
  • Blue glow

    What are those racks with the blue glow in between the Netezza and the System z? Power 795?