
Facebook's data centers worldwide, by the numbers and in pictures

Grounded in open source and energy-efficient designs, the world's largest social network shared its latest updates on the cost and power savings attributed to its cutting-edge datacenters.
[Image 1 of 17: prineville-data-center-4.jpg (Rachel King/ZDNet)]

A status update on Facebook's homegrown datacenters

With more than 1.32 billion users and counting, Facebook is arguably at the forefront of the burgeoning but still nascent social media world. But the Menlo Park, Calif.-headquartered company is also paving the way in an arena with which most of its global membership base is likely far less familiar.

While Google, Microsoft and Amazon Web Services clamor to host the cloud needs of their Silicon Valley neighbors and social media darlings from Pinterest to Pulse, Facebook has been building out its own datacenter footprint over the last few years.

Drawing inspiration from open source principles and energy-efficient design, the world's largest social network shared its latest updates with ZDNet, revealing cost and power savings attributed to its cutting-edge datacenter designs.

All images via Facebook

[Image 2 of 17: prineville-data-center-3.jpg]

Facebook: Prineville, Oregon

Over the past three years, Facebook boasted, it has saved more than $1.2 billion by optimizing its full stack: datacenter design, hardware, and software.

Pictured above, Facebook's flagship datacenter building in Prineville, Oregon was constructed with 950 miles of wire and cable — touted to be equivalent to the distance between Boston and Indianapolis.

[Image 3 of 17: prineville-data-center-5.jpg]

Facebook: Prineville, Oregon

The Prineville facility is made out of 1,560 tons of steel, equal in weight to approximately 900 mid-size cars.

All in all, Prineville's Building 1, with a footprint of roughly 332,930 sq. ft., would equate to an 81-story building if stood on end.

[Image 4 of 17: prineville-data-center-6.jpg]

Facebook: Prineville, Oregon

Prineville was Facebook's first datacenter deployed using Open Compute Project designs. When it started serving traffic, Facebook said it was 38 percent more energy-efficient than its leased capacity at the time, lowering operational costs by up to 24 percent.
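The efficiency comparison above can be put in rough perspective with a toy Power Usage Effectiveness (PUE) calculation. PUE is total facility power divided by IT equipment power, so a lower PUE means less overhead. The figures below are assumptions for illustration only, not numbers from the article:

```python
# Illustrative sketch (assumed figures, not Facebook's published data):
# comparing facility power draw under two hypothetical PUE values.
# PUE = total facility power / IT equipment power; lower is better.

def total_facility_kw(it_load_kw: float, pue: float) -> float:
    """Total power drawn for a given IT load at a given PUE."""
    return it_load_kw * pue

# Assumed values for illustration:
LEASED_PUE = 1.9   # plausible leased/colo facility of that era (assumption)
OCP_PUE = 1.07     # approximate early PUE reported for Prineville

it_load_kw = 1_000.0  # hypothetical 1 MW of servers
leased = total_facility_kw(it_load_kw, LEASED_PUE)
ocp = total_facility_kw(it_load_kw, OCP_PUE)
savings_pct = 100 * (leased - ocp) / leased

print(f"Leased: {leased:.0f} kW, OCP: {ocp:.0f} kW, savings: {savings_pct:.0f}%")
```

Under these assumed PUEs, the same 1 MW of servers draws 1,900 kW in the leased facility but only 1,070 kW in the OCP design, a reduction of roughly 44 percent; the exact savings depend on the real PUE values.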

[Image 5 of 17: altoona-data-center-1.jpg]

Facebook: Altoona, Iowa

The Facebook Altoona datacenter campus spans 202 acres, described as 42 acres larger than Disneyland.

[Image 6 of 17: altoona-data-center-2.jpg]

Facebook: Altoona, Iowa

Fun fact: If you had enough ping pong balls, Facebook estimated it could fit 6.4 billion of them in Altoona Data Center Building One.

[Image 7 of 17: altoona-data-center-8.jpg]

Facebook: Altoona, Iowa

Altoona 1 plans were first unveiled more than a year ago. Since then, more than 460 people have worked on the project, logging more than 435,000 hours in the ongoing construction of the 476,000-square-foot building.

Pending local council approval, Facebook plans to break ground on a second datacenter building designed to mirror the first, aptly named Altoona 2.


[Image 8 of 17: forest-city-1.jpg]

Facebook: Forest City, North Carolina

The rural 160-acre campus in Forest City, N.C. opened in 2012, taking the building blocks of the Open Compute Project to a new level.

Thanks to design efficiencies attributed to OCP, Facebook said it has saved $1.2 billion in infrastructure costs, along with enough energy to power 40,000 homes for a year, the carbon equivalent of taking 50,000 cars off the road.

[Image 9 of 17: forest-city-2.jpg]

Facebook: Forest City, North Carolina

To demonstrate how these energy and cost savings happen, Facebook explained that it reuses server heat, channeling a portion of the excess to warm office space during colder months.

[Image 10 of 17: forest-city-3.jpg]

Facebook: Forest City, North Carolina

An evaporative cooling system evaporates water to cool the incoming air, as opposed to traditional chiller systems that require more energy-intensive equipment. Facebook touts this process as highly energy efficient, and because it leans on outside air, it also minimizes water consumption.
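A back-of-the-envelope sketch shows why evaporative cooling is so effective: evaporating water absorbs its latent heat of vaporization, roughly 2.45 MJ per kilogram near room temperature, so a large heat load can be carried off by a modest stream of water. The heat load below is a hypothetical figure, not one from the article:

```python
# Back-of-the-envelope sketch (hypothetical load, standard physics constant):
# how much water an evaporative cooling system must evaporate to absorb a
# given heat load. Evaporating water absorbs its latent heat of vaporization.

LATENT_HEAT_J_PER_KG = 2.45e6  # latent heat of vaporization of water at ~20 C

def water_kg_per_s(heat_load_w: float) -> float:
    """Water that must evaporate per second to absorb heat_load_w watts."""
    return heat_load_w / LATENT_HEAT_J_PER_KG

heat_load_w = 1_000_000.0  # hypothetical 1 MW of server heat
kg_s = water_kg_per_s(heat_load_w)
print(f"{kg_s:.2f} kg/s of water (~{kg_s * 3600 / 1000:.1f} cubic meters/hour)")
```

Under these assumptions, absorbing 1 MW of server heat takes only about 0.41 kg of evaporated water per second; a compressor-based chiller moving the same heat would itself draw a substantial fraction of that megawatt in electricity.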

[Image 11 of 17: forest-city-4.jpg]

Facebook: Forest City, North Carolina

The Forest City center runs 100 percent on outdoor air, saving on heating and cooling costs from power-hungry air handlers.

[Image 12 of 17: lulea-rddc-construction.jpeg]

Facebook: Luleå, Sweden

When in Sweden, Facebook is evidently doing as the Swedes do. Facebook's first facility abroad is (no joke) taking an Ikea-like approach with its own out-of-the-box, pre-fab datacenter blueprint.

[Image 13 of 17: lulea-rddc-rendering.jpeg]

Facebook: Luleå, Sweden

Dubbed the rapid deployment datacenter (RDDC) design, the guide takes modular and lean construction principles and applies them at the scale of a Facebook datacenter.

The RDDC design is based on two concepts. The first is a "chassis" approach: a structural frame is pre-fitted with components, from lighting to cables, on a factory assembly line, and the entire construct is then driven to the building site on the back of a flatbed truck. The second is an Ikea-style flat-pack approach, in which the walls of the datacenter are panelized into standard modules that are easy to transport.

Facebook believes this will enable it to deploy two data halls in the time it previously took to deploy one while also cutting back greatly on the amount of material required for construction.


[Image 14 of 17: lulea-exhaust-fans.jpg]

Facebook: Luleå, Sweden

Facebook design engineer Marco Magarelli admitted in a blog post back in March that the RDDC design actually started out as a hack.

"Our previous datacenter designs have called for a high capacity roof structure that carries the weight of all our distribution and our cooling penthouse; this type of construction requires a lot of work on lifts and assembly on site," Magarelli wrote. "Instead, as Ikea has done by packing all the components of a bookcase efficiently into one flat box, we sought to develop a concept where the walls of a datacenter would be panelized and could fit into standard modules that would be easily transportable to a site."

[Image 15 of 17: lulea-interior.jpg]

Facebook: Luleå, Sweden

Since Facebook started deploying its open hardware, the social network estimated it has saved enough energy to power more than 40,000 homes for a year.

[Image 16 of 17: lulea-data-center-4.jpg]

Facebook: Luleå, Sweden

Supported by the power of datacenters like these, Facebook noted it sees an average of six billion likes per day. Over the last 10 years, Facebook's datacenters have handled more than 400 billion shared photos and 7.8 trillion sent messages.

[Image 17 of 17: lulea-data-center-5.jpg]

Facebook: Luleå, Sweden

Facebook is currently testing the chassis approach at its second building under construction at the Luleå campus. Spanning about 125,000 sq. ft., it will be the first Facebook datacenter building to feature the RDDC design upon completion.
