How Open Compute has shaped Facebook's Forest City data center

The North Carolina data center is built upon the bare-bones, power-optimized premise of the Open Compute Project, with design efficiencies that Facebook says saved $1.2 billion in infrastructure costs.
Written by Natalie Gagliordi, Contributor

FOREST CITY, NC — Tucked away in the foothills of North Carolina's ancient South Mountains is the gleaming campus of Facebook's Forest City data center.

In operation since 2012, the rural 160-acre campus is one of a fleet of Facebook data centers located across the country, joining the flagship US location in Prineville, Ore., and an upcoming location in Iowa.

On a recent tour of the facility, guided by data center site manager Keven McCammon, it became evident that Facebook doesn't just want to be part of the green Internet initiative; it wants to lead it. That means abandoning the cutthroat "Fight Club" mentality of yesteryear and embracing the power of community-driven innovation.

The Forest City data center is built upon the bare-bones, power-optimized premise of the Open Compute Project (OCP), and nearly every aspect of its functionality is touched by that design.

OCP launched in 2011 with the goal of developing servers and data centers with a model traditionally associated with open source software projects. In the three years since, it seems the social media giant has indeed learned a thing or two from the communal think-tank.

Thanks to design efficiencies gleaned from OCP, Facebook says it has saved $1.2 billion in infrastructure costs, enough energy to power 40,000 homes for a year, and the carbon equivalent of taking 50,000 vehicles off the road.

So what exactly are those design efficiencies? Well, for starters, the Forest City center runs on 100 percent outdoor air, saving a boatload on the heating and cooling costs that power-hungry air handlers would otherwise rack up.

In the two-hour tour, nearly half the time was spent going through corridor after corridor of the air-cooling and filtration mechanisms (refined through OCP input) that keep the center's undisclosed number of servers at an ideal temperature.

A super-simplified explanation of the air treatment process goes like this: Louvers bring outside air in, and it's sent through a first phase of filters, or prefilters. In the next room, another wall of louvers mixes in warm return air, which heats the incoming stream like a radiator and dries it out. Beyond that room, the air is pushed through wet mesh, called Munters media, that cools it back down before it's injected into the data hall. Once that air has traveled across the electronics and heated back up, it collects in a plenum and is either recycled, used for heating or dehumidifying, or sent to an exhaust fan.
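To make that sequence concrete, here's a minimal sketch of the flow in Python. The stages mirror the tour description above; every setpoint, threshold, and mixing formula is an illustrative assumption, since Facebook didn't disclose its actual operating parameters.

```python
# Illustrative trace of the Forest City air-treatment flow described
# above. All setpoints are assumed for the sketch, not real values.

def treat_air(outside_temp_f, outside_rh, return_temp_f):
    """Follow one parcel of air from the louvers to the plenum."""
    # 1. Louvers bring outside air in; prefilters strip dust and pollen.
    temp, rh = outside_temp_f, outside_rh

    # 2. A second wall of louvers mixes in warm return air, heating the
    #    stream like a radiator and drying it out on cold or damp days.
    if temp < 65.0 or rh > 0.80:           # assumed mixing thresholds
        temp = (temp + return_temp_f) / 2  # crude 50/50 mixing model
        rh = rh * 0.7                      # warmer air holds more moisture

    # 3. Wet Munters media evaporatively cools air that is still too warm.
    if temp > 80.0:                        # assumed supply-air limit
        temp -= 10.0                       # crude evaporative drop
        rh = min(rh + 0.15, 1.0)

    # 4. Supply air crosses the servers and heats back up in the plenum.
    plenum_temp = temp + 20.0              # assumed server delta-T

    # 5. Plenum air is recycled (for heating/dehumidifying) or exhausted.
    action = "recycle" if outside_temp_f < 65.0 else "exhaust"
    return temp, plenum_temp, action

# A hot summer day: evaporative cooling kicks in, hot air is exhausted.
print(treat_air(92.0, 0.40, 95.0))  # (82.0, 102.0, 'exhaust')
```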

In addition to the cooling system, Facebook officials touted the server design and the use of its cold storage facility as OCP-influenced efficiencies.

For the servers, the Winterfell Web server design slides 86 servers into the three bays of the Open Compute chassis. The OCP's vanity-free design principle also takes effect here: everything that isn't strictly needed for the server to function is stripped away. For instance, officials noted the absence of the traditional plastic chassis cover, which, when present, forces the fans to burn an extra 28 watts pulling air through the impedance the bezel creates.
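That 28-watt figure invites a quick back-of-the-envelope calculation. The sketch below is illustrative only; the server count is an assumed placeholder, since Facebook doesn't disclose how many machines the facility runs.

```python
# Back-of-the-envelope fan-power savings from dropping the bezel. The
# 28 W figure comes from the tour; the server count is a hypothetical
# placeholder, since Facebook doesn't disclose its server totals.

WATTS_SAVED_PER_SERVER = 28    # fan power wasted pulling air past a bezel
SERVERS = 100_000              # assumed fleet size, for illustration only
HOURS_PER_YEAR = 24 * 365

kwh_per_year = WATTS_SAVED_PER_SERVER * SERVERS * HOURS_PER_YEAR / 1000
print(f"{kwh_per_year:,.0f} kWh saved per year")  # 24,528,000 kWh
```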

The 90,000-square-foot cold storage facility functions as a digital attic for the more than 400 billion photos uploaded to Facebook, with efficiency coming from stowing away the photos that are no longer in heavy rotation. Because the photos stored here are rarely accessed, the servers don't run as hot and therefore don't require the usual amount of cooling.
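Conceptually, the attic works like a simple access-frequency routing rule. The sketch below is a schematic illustration, not Facebook's actual tiering policy; the 90-day cutoff and the function name are assumptions.

```python
# Schematic hot/cold routing rule in the spirit of the "digital attic."
# The cutoff and names are illustrative assumptions.

from datetime import datetime, timedelta

COLD_AFTER = timedelta(days=90)  # assumed cutoff for "heavy rotation"

def storage_tier(last_accessed, now):
    """Route a photo to primary storage or the cold storage facility."""
    # Rarely viewed photos go to cold storage, where servers run cooler
    # and need far less cooling.
    return "cold" if now - last_accessed > COLD_AFTER else "hot"

print(storage_tier(datetime(2014, 1, 1), datetime(2014, 7, 1)))  # cold
```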

Also to Open Compute's credit, Facebook officials said the team was able to develop a cold storage server in which microservers slide into the SAS expander slot, making room for another 60 drives.

All of those innovations and efficiencies boil down to Facebook's Open Compute mantra: Community can accelerate the pace of innovation.

"All the industries that contribute to Open Compute — the greater of that knowledge is greater than the one," McCammon said at the end of the tour, as we scooted back to the main building on electric golf carts. 

"The efficiency, the power we are saving with OC designs, the value is there alone. The same thing goes with emissions. We have a society and an environment that we need to take care of, and this is helping us do that." 
