Submerge your datacenter

Summary: Fluid-submersion server cooling systems take a different approach to datacenter energy efficiency

While free air cooling, and the many methods being tried to cool datacenter servers at minimal cost in power and environmental impact, gets the lion's share of public notice, it's not the only technique being examined, especially since free air cooling requires a datacenter designed from the ground up, or completely re-engineered, to take advantage of environmental cooling.

In what could be considered the flip side of free air cooling, completely submerging your servers in dielectric fluid to provide a more temperature-friendly environment is also being tried. Cooling electronics with dielectric fluid is not a new technology, but designing server racks that take advantage of the cooling properties of this environment is a new use for this established technology. Green Revolution Cooling (GRC) is one of the leaders in the effort to bring fluid-submersion technology to the mainstream datacenter.

While the basics of liquid cooling aren't new, what GRC is bringing to the table is a cost-effective solution that is self-contained and can make use of standard rack-mounted servers from any OEM. Their 100 kW, 42U rack systems, called the CarnotJet system, bear little resemblance to traditional server racks, however, and look more like the freezer cases you would have found in a 1950s supermarket, with the horizontal orientation necessary to keep the dielectric fluid covering the rack-mounted equipment. And don't let the term "dielectric fluid" make you worry that you will be spending money on some newfangled high-tech liquid: it's essentially fragrance-free baby oil, which is probably the best-known use of mineral oil.

The hook here is that standard OEM servers and blades can be deployed in this fashion with only minimal prep work. The most obvious step is the removal of any system fans, which aren't needed by the submerged system (GRC provides fan emulators for systems that check fan status in order to operate). Removing the fans is also responsible for significant power savings: GRC claims this change alone can reduce a server's power use by as much as 20%.
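
To put that fan figure in rough perspective, here is a back-of-the-envelope sketch in Python. Every input is an illustrative assumption (server wattage, one server per U, electricity price), not a GRC or OEM number; only the 20% fan-power fraction comes from the claim above.

    # Rough per-rack savings from removing server fans.
    # All inputs below are illustrative assumptions, not vendor figures.
    server_power_w = 400           # assumed average draw of one 1U server, in watts
    fan_fraction = 0.20            # upper end of the "as much as 20%" fan-power claim
    servers_per_rack = 42          # assume one server per U of a 42U rack
    electricity_usd_per_kwh = 0.10 # assumed electricity price
    hours_per_year = 24 * 365

    fan_power_per_rack_kw = server_power_w * fan_fraction * servers_per_rack / 1000.0
    annual_kwh = fan_power_per_rack_kw * hours_per_year
    annual_usd = annual_kwh * electricity_usd_per_kwh

    print(f"Fan power avoided per rack: {fan_power_per_rack_kw:.2f} kW")
    print(f"Energy avoided per rack-year: {annual_kwh:,.0f} kWh (about ${annual_usd:,.0f})")

Under those assumptions the fans account for a few kilowatts per rack, and the real benefit would be somewhat larger, since every watt the fans no longer draw is also a watt the cooling plant no longer has to remove as heat.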

The second requirement is that any hard drive in the server needs to be encapsulated, a process GRC performs that seals the drive and makes it airtight. This is only necessary for rotating media; SSDs are good to go as is. The last requirement is that the thermal grease between CPU/GPU and heat sink be removed and replaced by a piece of indium foil. This is necessary because thermal grease is soluble in mineral oil and would dissolve over time; the soft foil maintains the connection between processor and heat sink even when submerged. GRC technicians can perform this entire process in less than 15 minutes per server in most cases.

For greenfield installations, GRC has been working with OEMs to build servers that are submersion-cooling ready (SuperMicro showcased its version in an announcement last week). GRC told me that they have equipment onsite with at least two other OEMs at this time and are expecting additional partners to come onboard.

The complete solution includes a Pump Module for every four cooled racks (this module also contains an integrated radiator, which is supplied only with evaluation units; all other cooling is sized to customer requirements). The solution is customized for each facility to allow waste heat to be transferred outside the datacenter. A full solution will also include secondary containment to prevent any coolant spills from contaminating the facility, and GRC also offers decommissioning services that completely remove the coolant from a facility that has elected to no longer use this solution.
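
For a sense of what "sized to customer requirements" involves, here is a minimal sketch of the coolant-loop arithmetic for one Pump Module serving four fully loaded 100 kW racks. The oil properties used are typical handbook values for mineral oil, and the loop temperature rise is an assumed design point, not a GRC specification.

    # Rough coolant-loop sizing for one pump module serving four 100 kW racks.
    # Oil properties are typical values for mineral oil; the temperature rise
    # across the loop is an assumed design point, not a vendor figure.
    racks_per_pump_module = 4
    heat_per_rack_kw = 100.0                # from the 100 kW rack rating
    oil_specific_heat_kj_per_kg_k = 1.67    # roughly typical for mineral oil
    oil_density_kg_per_l = 0.85             # roughly typical for mineral oil
    loop_delta_t_k = 10.0                   # assumed oil temperature rise across the racks

    heat_to_reject_kw = racks_per_pump_module * heat_per_rack_kw
    oil_mass_flow_kg_s = heat_to_reject_kw / (oil_specific_heat_kj_per_kg_k * loop_delta_t_k)
    oil_volume_flow_l_min = oil_mass_flow_kg_s / oil_density_kg_per_l * 60

    print(f"Heat to reject per pump module: {heat_to_reject_kw:.0f} kW")
    print(f"Required oil flow: {oil_mass_flow_kg_s:.1f} kg/s (~{oil_volume_flow_l_min:.0f} L/min)")

Because mineral oil holds roughly a thousand times more heat per unit volume per degree than air, even this sizeable flow is modest compared with the volume of air a conventional hot aisle would need to move for the same 400 kW.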

Talkback

18 comments
  • A great implementation and...

    Quite compatible with well-oiled salespersons.
    premiertechnologist
    • I see what you did there. [nt]

      Champ_Kind
  • One advantage not mentioned...

    A significant reduction of noise in the equipment room.

    Some of those Blade installations have really noisy fans, and the noise pollution is quite annoying.

    I would expect that these units will run quietly.
    premiertechnologist
    • Amen

      Our server room sounds like a jump-jet testing facility.
      Imrhien
  • "waste" heat

    Seems to me there should be no "waste" here. If the building were located somewhere requiring year-round heating, the "waste" heat could, it would seem, be harnessed well enough to be the sole source of heating the facility. Perhaps a heat exchanger feeding an under-floor heating solution.

    There's also the simple matter of hot water. Surely a data center's entire hot water needs could be met using that "waste" heat.

    If I were an architect I'd be penciling out where and how to build a system with no "waste." I'm imagining a center built in conjunction with hydroelectric power generation in northern Canada or Greenland. Greenland would seem ideal, being near the geographical center of the populations using the majority of data center resources.
    pgit
    • There is PLENTY of Waste Heat...

      Server buildings produce a LOT of heat, and it is nearly all waste heat. Even with air-to-air cooling (conventional HVAC) it is possible to have a water loop for hot water, but frankly, server facilities don't have much use for hot water. Hand washing is about it, and since there are very few people working in a facility, there isn't much need even for that.

      Also, while submerging servers in mineral oil, or even a freon liquid, is MUCH more efficient on that end, the heat still needs to go somewhere, and that is outside.
      Fetrow
      • There is PLENTY of waste heat.

        All it takes is thinking outside the box for a 'waste heat' solution.

        One example:

        http://www.datacenterknowledge.com/archives/2010/02/03/data-centers-heat-offices-greenhouses-pools/
        fatman65535
  • Liquid cooled

    Fantastic idea; however, this increases the required rack footprint by orienting the racks horizontally rather than vertically. If there were a vertical solution, this would be far more cost-effective.
    PCalvert
    • Less an issue with multiples

      The footprint issue is less pronounced with multiples than with single units.
      cuhulin1
  • How do you swap a board?

    If a fault occurs on a board/blade that is immersed in baby oil, how does one swap it out?
    peterm@...
    • with a towel?

      :)
      pgit
  • not so new...

    http://www.youtube.com/watch?feature=player_embedded&v=PtufuXLvOok
    daddmac
    • Older still... "Bubbles"

      The Cray-2 system, introduced in 1985, was submerged in Fluorinert, an inert liquid from 3M that was new at the time.

      Due to the use of liquid cooling, the Cray-2 was given the nickname "Bubbles"

      p.s. as the article said, "While the basics of liquid cooling aren't new, what GRC is bringing to the table is a cost-effective solution..."
      Cmd_Line_Dino
      • those were mesmerizing

        I remember standing in front of one of those as a child, watching the liquid-filled innards and trying to imagine what I'd just been told: that if the liquid wasn't there, the circuitry would melt in a fraction of a second. It was also the first time I'd seen such a thing; it baffled me, since I didn't know how mineral oil was different from water, and I knew that water was deadly for electronics.
        Htalk
  • More possibilities...

    If wide-scale acceptance comes sooner, I'm hoping that the significant savings would be passed on to consumers. Cooling a data center takes an enormous amount of energy, and also requires a significant amount of backup resources that not only have to power the servers, but also the cooling equipment. If all that can be eliminated, then the recently announced plans of hosting data centers in international waters as well as on stationary airborne blimps may come to fruition sooner.

    As with anything, I would like to see more in-depth research on the long-term consequences of this.

    Ray Chinoy
    SEO Strategist
    http://www.Level9Solutions.com
    Level9Solutions
    • You know as well as everyone else

      that corporations are only going to use their savings to pad their bottom line. Maybe some dividends, maybe some stock buybacks, maybe pay increases for the execs, but that's about it.
      Champ_Kind
  • data center looking like pools

    If data centers go the route of submerged cooling solutions, maybe in the future data center techs will be wearing scuba gear to work :) Imagine a data center looking like a big swimming pool with "island racks".
    setske
  • It's about time......

    While air cooling *works*, it's horribly inefficient. Air is a terrible conductor of heat. The insulating ability of air is well known and well exploited by both humans and nature, whether as wool, Fiberglas, Styrofoam or fur, all of which insulate by trapping air in some manner. Ever reach into a 450-degree oven? The hot air does not burn your arm in the time it takes to slide out the pizza, even if done slowly, whereas 200-degree water (or oil) will remove skin in an instant.

    It's about time someone applied the old technology of liquid cooling to the new technology of massive server farms. Whether overall efficiency is significantly increased is debatable, though, since the same amount of server heat must be dissipated. But if it helps even a couple of percent, it is still a massive energy savings, and you can feel good about having a lower personal carbon footprint since the servers storing your 5-year-old emails are running more efficiently.
    Steve I.