While free air cooling gets the lion's share of public notice among the many methods being tried to cool datacenter servers at minimal cost in power and environmental impact, it's not the only technique under examination, especially since free air cooling requires a datacenter designed from the ground up, or completely re-engineered, to take advantage of ambient conditions.
In what could be considered the flip side of free air cooling, completely submerging servers in dielectric fluid to provide a more temperature-friendly environment is also being tried. Cooling electronics with dielectric fluid is not a new technology, but designing server racks that take advantage of its cooling properties is a new use for this established technique. Green Revolution Cooling (GRC) is one of the leaders in the effort to bring fluid-submersion cooling to the mainstream datacenter.
While the basics of liquid cooling aren't new, what GRC brings to the table is a cost-effective, self-contained solution that can use standard rack-mount servers from any OEM. Its 100 kW, 42U rack-mount systems, called the CarnotJet system, bear little resemblance to traditional server racks, however; they look more like the freezer cases you would have found in a 1950s supermarket, with a horizontal orientation that keeps the dielectric fluid covering the rack-mounted equipment. And don't let the term "dielectric fluid" make you worry that you'll be spending money on some newfangled high-tech liquid. It's essentially fragrance-free baby oil, probably the best-known use of mineral oil.
The hook here is that standard OEM servers and blades can be deployed this way with only minimal prep work. The most obvious step is removing the system fans, which a submerged system doesn't need (GRC supplies fan emulators for servers that check fan status before operating). This modification also yields significant power savings: GRC claims it alone can reduce a server's power draw by as much as 20%.
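That claim is easy to sanity-check with a back-of-envelope calculation. The sketch below is illustrative only: the per-server wattage and the one-server-per-U rack density are my own assumptions, not GRC figures; only the "up to 20%" fan-savings number comes from the article.

```python
# Back-of-envelope sketch of the claimed fan-removal savings.
# The wattage and density figures are assumptions, not GRC-published data.

def server_power_after_fan_removal(server_watts, fan_savings_fraction=0.20):
    """Power draw once fans are removed, given the 'up to 20%' claim."""
    return server_watts * (1 - fan_savings_fraction)

rack_servers = 42          # one 1U server per U in a full 42U rack (assumption)
per_server_watts = 350     # typical 1U server draw (assumption)

before = rack_servers * per_server_watts
after = rack_servers * server_power_after_fan_removal(per_server_watts)

print(f"Rack draw before: {before / 1000:.1f} kW")            # 14.7 kW
print(f"Rack draw after:  {after / 1000:.2f} kW")             # 11.76 kW
print(f"Savings:          {(before - after) / 1000:.2f} kW")  # 2.94 kW
```

Even at these modest assumed wattages, a full rack gives back nearly 3 kW, which is where the economics of removing the fans start to add up.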
The second requirement is that any hard drive in the server be encapsulated, a process GRC performs that seals the drive and makes it airtight. This is only necessary for rotating media; SSDs are good to go as is. The last requirement is that the thermal grease between CPU/GPU and heat sink be removed and replaced with a piece of indium foil, because thermal grease is soluble in mineral oil and would dissolve over time. The soft foil maintains the thermal connection between processor and heat sink even when submerged. GRC technicians can complete this entire prep process in under 15 minutes per server in most cases.
For greenfield installations, GRC has been working with OEMs to build servers that are submersion-cooling ready (SuperMicro showcased its version in an announcement last week). GRC told me it has equipment onsite with at least two other OEMs at this time and expects additional partners to come onboard.
The complete solution includes a pump module for every four cooled racks (the module's integrated radiator is supplied only with evaluation units; production cooling is otherwise sized to customer requirements). The solution is customized for each facility so that waste heat can be transferred outside the datacenter. A full deployment also includes secondary containment to prevent any coolant spill from contaminating the facility, and GRC offers decommissioning services that completely remove the coolant from a facility that has elected to stop using the solution.
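Using the figures in the article, one pump module per four racks and 100 kW per rack, a facility planner could rough out the module count and total cooled load as follows; the function names and the example rack count are hypothetical.

```python
import math

RACK_CAPACITY_KW = 100     # per the article: 100 kW per 42U rack
RACKS_PER_PUMP_MODULE = 4  # per the article: one pump module per four racks

def pump_modules_needed(rack_count):
    """Pump modules required, rounding up for any partial group of racks."""
    return math.ceil(rack_count / RACKS_PER_PUMP_MODULE)

def total_cooled_load_kw(rack_count):
    """Maximum cooled load if every rack is fully populated."""
    return rack_count * RACK_CAPACITY_KW

# Hypothetical 10-rack deployment:
print(pump_modules_needed(10))   # 3 pump modules
print(total_cooled_load_kw(10))  # 1000 kW (1 MW) of cooled load
```

The rounding-up matters: nine racks and twelve racks both need three pump modules, so deployments sized in multiples of four racks make the fullest use of each module.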