Cool runnings: IBM's recipe for a happy datacentre, in pictures

Summary: How do you make your datacentre run better, and save money? At IBM's research labs and facilities in upstate New York, three ideas are uppermost on people's minds: energy efficiency, monitoring, and utilisation.

  • Save energy wherever you can, and use your computer better: those are the mantras at IBM's research labs in Poughkeepsie, New York.

    These tenets apply both to the equipment IBM is putting into datacentres, and to the infrastructure surrounding them. The result, the company hopes, is a datacentre that saves energy and money - and it is exploring seemingly every feasible avenue to achieve its aims.

    "If you look at a pie chart on how much energy is used in a datacentre, roughly half the pie goes to infrastructure, and half goes to the IT," says Roger Schmidt, IBM fellow and chief engineer for datacentre energy efficiency. "That's not good. The work is done by the IT, right? So you want to make the infrastructure half of the pie as small as possible. We're trying to tie the equipment more tightly to the environment so we can save energy."

    For IBM, this means a focus on two areas: air-cooling and water-cooling. These are nowhere more apparent than in the company's 'green datacentre' at Poughkeepsie (above), which has an air-cooled side and a water-cooled side.

    The air-cooled side of the room relies on a principle as simple as opening a window. "It's cold outside," Schmidt says, pointing to the minor blizzard hitting the vast plain of the IBM car park. "Just open these windows. Why can't we do that?" In fact, the industry is doing just that, he says: using ambient air to cool datacentre equipment. The upside is that air-conditioning can be turned off, cutting power consumption and emissions.

  • Of course, not every datacentre can be built in chilly climes. "The latency is the problem; the delay. A lot of companies have to have datacentres located near their work, and in the country," Schmidt says. Some places, for example, mandate that data cannot leave the country for data protection reasons.

    But even if a business is located in a warmer climate, its datacentre can still tap into its surroundings, partly because the industry is moving away from the notion that datacentres have to stay nicely chilled. IBM's chips, Schmidt says, are designed to work in temperatures of up to 85°C (185°F). Keep the equipment below that, and it should work just fine.

    It follows, therefore, that you can use outside water at ambient temperature to chill your datacentre - creating the seemingly paradoxical notion of 'hot-water cooling'. "Like the air cooling... why can't we use the outside water to cool?" Schmidt says. The advantage, of course, is that if you can use water at ambient temperature, you're not using energy to chill it first.

    Some of the company's x-series mainframes have inbuilt water-cooling with a water temperature of 45°C, and IBM is looking to push the water temperature for cooling even higher, to around 65°C: "What we've been talking about is using the water to cool the chips, then it goes through the system and comes out hotter, maybe 70°C, and we use that hotter water for the heating of buildings and central distribution in towns," Schmidt says. "There's a lot of interest in this because we've never really been able to solve the waste heat problem for datacentres. So now we use hotter water, we can use that waste heat for good purposes."

    The principle is illustrated above: the dial on the left shows the temperature of the water coming in from the Hudson River, just outside the IBM plant. The right-hand dial shows the water temperature on the way out, after it has collected heat from the datacentre. A back-of-the-envelope version of this heat-recovery arithmetic also follows the gallery.
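Schmidt's pie-chart framing is essentially what the industry's power usage effectiveness (PUE) metric captures: total facility power divided by the power that actually reaches the IT equipment. Below is a minimal sketch of that arithmetic; the kilowatt figures are purely illustrative, not IBM's measurements.

```python
def pue(it_power_kw: float, infrastructure_power_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power."""
    return (it_power_kw + infrastructure_power_kw) / it_power_kw

# A 50/50 split, as in Schmidt's pie chart, gives a PUE of 2.0:
# half the energy the facility draws does no computing at all.
print(pue(it_power_kw=500, infrastructure_power_kw=500))  # 2.0

# Shrinking the infrastructure slice pushes PUE towards the ideal of 1.0.
print(pue(it_power_kw=500, infrastructure_power_kw=100))  # 1.2
```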

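The two dials also hint at how the recoverable waste heat can be estimated: the heat carried off by the loop is the water's flow rate, times its specific heat, times the temperature rise from inlet to outlet. The sketch below uses the 45°C-in, 70°C-out figures Schmidt mentions and an assumed flow rate of 2 kg/s; the function name and numbers are illustrative, not readings from the Poughkeepsie plant.

```python
SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K)

def recovered_heat_kw(flow_kg_per_s: float, inlet_c: float, outlet_c: float) -> float:
    """Heat picked up by the cooling loop, Q = m_dot * c * (T_out - T_in), in kilowatts."""
    return flow_kg_per_s * SPECIFIC_HEAT_WATER * (outlet_c - inlet_c) / 1000.0

# Water entering the loop at 45°C and leaving at 70°C, at an assumed 2 kg/s flow,
# carries away roughly 209 kW of heat that could feed a building's heating system.
print(round(recovered_heat_kw(flow_kg_per_s=2.0, inlet_c=45.0, outlet_c=70.0)))  # 209
```

The hotter the water comes out, the more useful that heat is for warming buildings or feeding district heating, which is why IBM is pushing cooling-water temperatures up rather than down.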

Topics: Cloud, Data Centers, Datacentre Tour, Emerging Tech, IBM, Servers


Talkback

3 comments
  • IBM's racks suck!

    Take a look at the cable management. It is horrible. We had a project where IBM brought in their own rack as part of a solution and had to live with it for 5 years until we finally got rid of that garbage. The cable management is virtually non-existent, or poor at best. We standardized on APC Wide and Deep racks with proper cable management, and it's far superior. They really need to do something about this if they want to provide complete solutions that work well in a professional server environment. I'd send back any solution that strings cables everywhere like that. Just my 2 cents.
    Drewidian
  • half?

    "So you want to make the infrastructure half of the pie as small as possible."

    I think I see your problem.
    RedEmma
  • Blue glow

    What are those racks with the blue glow in between the Netezza and the System z? Power 795?
    AdamS12