How Facebook ended up with baked potato inside its servers

Summary: Facebook's open hardware chief on how attempts to design the world's most efficient datacentre led to the smell of fries and lots of hungry engineers.

The smell of chips cooking in the datacentre is rarely good news.

But the unmistakable odour given off by servers being tested by Facebook earlier this year was more French fries than modern microprocessor.

How did Facebook end up baking potatoes inside servers? As Frank Frankovsky, Facebook's VP of hardware design and supply chain operations, explained, it stemmed from the company's experiments under the Open Compute Project (OCP).

Under the OCP, Facebook and its partners have committed to developing novel designs for compute, storage and general datacentre infrastructure — not just the servers themselves, but the chassis and racks they sit in and their associated power and cooling — and then to sharing those designs so they can be refined and built upon.

Frankovsky and his OCP partners had been looking for a way to reduce the amount of waste material generated by a server, which had led them to remove the server's lid. The problem was that a lidless server didn't direct enough air over the top of the CPUs for cooling. Not being big fans of adding environmentally unfriendly components, they hit on the idea of using the material found in Spudware, kitchen utensils made from 80 percent potato starch. Unfortunately, there was a downside.

"We created a thermal lid out of that starchy material and found out pretty quickly that when you heat that up it smells a lot like French Fries, so people in the datacentre were getting pretty hungry. It also gets a little floppy and gloopy," he said at a briefing in London today.

It's not the first time that Facebook's OCP experiments have resulted in some pretty unorthodox outcomes.

Earlier this year Facebook described how an actual cloud had formed inside its datacentre in Prineville, Oregon, as a result of water condensing out of air that had passed through its fresh-air cooling system, leading to power supply failures and servers shutting down and rebooting.

The upshot is that the spec for OCP power supplies now includes a special coating to prevent condensation from forming.

"We learned a lot from that and we applied some conformal coating. Now this second condensation event occurred and we had nothing fail," said Frankovsky.

Pushing the limits of the datacentre

Frankovsky said that datacentre operators are generally reluctant to deviate from tried and tested ways of building and running servers, something that costs them in terms of efficiency.

"What I would say to those operators is 'Start pushing that envelope a little bit harder'," he said, adding that computer hardware can survive in far more challenging conditions than is generally accepted.

"Computing is fairly resistant to heat, humidity and even condensation, believe it or not."

Facebook runs its datacentres without computer room air conditioning, uses 100 percent outside air for cooling, removes the room-wide Uninterruptible Power Supply and delivers "higher voltage AC power directly to the server".

As a result, Frankovsky said, Facebook's datacentres achieve a power usage effectiveness (PUE) rating of 1.07, far better than what he called the "gold standard" for datacentres of 1.5.

PUE measures the ratio of the total power delivered to a facility to the amount that reaches the servers – so a datacentre with a PUE of 1.5 needs to draw 1.5W of power to get 1W to a server.
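
To make that ratio concrete, here is a minimal sketch of the arithmetic, assuming a purely illustrative 10MW IT load (the load figure is an assumption, not a Facebook number):

```python
# Minimal sketch: how much power a facility must draw at a given PUE.
# The 10 MW IT load below is a hypothetical figure for illustration only.

def facility_power_kw(it_load_kw: float, pue: float) -> float:
    """Total power the facility must draw to deliver it_load_kw to the servers."""
    return it_load_kw * pue

it_load_kw = 10_000  # assumed IT load: 10 MW of server power

for pue in (1.5, 1.07):  # the "gold standard" versus the figure Frankovsky cites
    total = facility_power_kw(it_load_kw, pue)
    overhead = total - it_load_kw
    print(f"PUE {pue}: draw {total:,.0f} kW to deliver {it_load_kw:,} kW "
          f"({overhead:,.0f} kW goes on cooling, power conversion and other overhead)")
```

At a PUE of 1.07, roughly 7 percent of the IT load goes on overhead, against 50 percent at the 1.5 "gold standard".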

"There are very few areas in the world that are so hot and humid that you can't get the inlet temperatures to a point where the electronics would survive," he said about the decision to remove air conditioning.

"But even if air conditioning isn't a risk they [datacentre operators] are willing to take, there is a lot that can be done in electrical efficiency.

"Eliminate the room-wide UPS, a room-wide UPS costs about $2 per watt, the Open Compute battery cabinets that we've also open-sourced, those are about 25 cents per watt.

"So not only could they save a bunch of money from a capex perspective, while delivering the same amount of backup power functionality, but it's also far more efficient because they're not transferring the AC to DC conversions, so they're not losing the power."

Facebook and its OCP partners even go as far as incorporating the logistics of how equipment is transported to the datacentre into their designs.

Talking about the designs used for equipment being sent to Facebook's Lulea datacentre in Sweden, which uses entirely OCP-designed infrastructure, Frankovsky said: "We designed the rack enclosure, as well as the pallets that it's transported on, to be able to plug a truck 100 percent. We want to make sure that every one of those trucks is absolutely plugged with equipment, so we don't have any wasted transportation costs."

About

Nick Heath is chief reporter for TechRepublic UK. He writes about the technology that IT-decision makers need to know about, and the latest happenings in the European tech scene.

Talkback

  • I still like the Iceland idea ...

    Put the servers in Iceland and just open the windows. (No, it's not a joke.)

    Lots of cool air and a country that is looking for international investments.

    (And having a lot of hot blondes is a fringe benefit that only marginally affects system performance, so I'm told.)
    Rick_R
  • Utilizing wasted heat

    In the days of CRT-type monitors and always on the lookout for a way to save energy, I would place my lunch on the back of the chassis and by noon it was nice and toasty. Kind of crazy with saving resources: To save not having to use a little plastic coffee stirrer, I put the cream in the bottom of the cup and then add the coffee. The turbulence is sufficient to mix it up properly.
    tlmurray
  • stories like this

    are why I don't recycle..these freaks just annoy me.
    TrishaDishaWarEagle
  • novec 7100

    I wonder if the Facebook lab have looked at Total immersive cooling using Novec 7100 or similar. Then they could do away with all kinds of lids and casings.
    Mytheroo
  • Of course when your product is Facebook...

    ...and your data can't really be considered mission-critical, it's easy to take more liberties with your datacentre infrastructure.
    Playdrv4me