Facebook's data centers worldwide, by the numbers and in pictures

  • A status update on Facebook's homegrown datacenters

    With more than 1.32 billion users and counting, Facebook is arguably at the forefront of the burgeoning but still nascent social media world. But the Menlo Park, Calif.-headquartered company is also paving the way in an area with which most of its global membership base might be less familiar.

    While Google, Microsoft and Amazon Web Services clamor to host the cloud needs of their Silicon Valley neighbors and social media darlings from Pinterest to Pulse, Facebook has been building out its own datacenter footprint over the last few years.

    Emphasizing inspirations ranging from open source to energy efficiency, the world's largest social network shared its latest updates with ZDNet, revealing cost and power savings attributed to its cutting-edge datacenter designs.

    All images via Facebook

    Published: August 29, 2014 -- 19:00 GMT (12:00 PDT)

    Caption by: Rachel King

  • Facebook: Prineville, Oregon

    Over the past three years, Facebook boasted, it has saved more than $1.2 billion by optimizing its full stack: the datacenter, hardware, and software.

    Pictured above, Facebook's flagship datacenter building in Prineville, Oregon was constructed with 950 miles of wire and cable — touted to be equivalent to the distance between Boston and Indianapolis.

  • Facebook: Prineville, Oregon

    The Prineville facility is made out of 1,560 tons of steel, equal in weight to approximately 900 mid-size cars.

    All in all, if you stood Prineville's Building 1 (with a footprint of roughly 332,930 sq. ft.) on end, it would stand as tall as an 81-story building.

  • Facebook: Prineville, Oregon

    Prineville was Facebook's first datacenter deployed using Open Compute Project designs. When it started serving traffic, Facebook said it was 38 percent more energy-efficient than its leased capacity at the time, lowering operational costs by up to 24 percent.
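    Efficiency claims like these are commonly framed in terms of power usage effectiveness (PUE), the ratio of total facility power to IT equipment power. A minimal sketch of the comparison; the specific PUE values below are illustrative assumptions, not figures from the article:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: 1.0 means every watt reaches IT gear."""
    return total_facility_kw / it_load_kw

# Assumed, illustrative values: a typical leased facility of the era
# versus an Open Compute-style owned build.
leased = pue(total_facility_kw=1500.0, it_load_kw=1000.0)  # 1.5
owned = pue(total_facility_kw=1100.0, it_load_kw=1000.0)   # 1.1

# Energy drawn per unit of useful IT work drops accordingly:
savings = 1 - owned / leased  # roughly 27 percent less energy
```

    The lower the PUE, the less power is spent on cooling, power conversion, and other overhead rather than on the servers themselves.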

  • Facebook: Altoona, Iowa

    The Facebook Altoona datacenter campus is 202 acres, described to be 42 acres larger than Disneyland.

  • Facebook: Altoona, Iowa

    Fun fact: If you had enough ping pong balls, Facebook estimated it could fit 6.4 billion of them in Altoona Data Center Building One.
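    As a sanity check, one can back out the building volume that estimate implies from the standard 40 mm ball diameter and a sphere-packing fraction; the packing fraction is an assumption on our part, not a figure from the article:

```python
import math

BALL_DIAMETER_M = 0.040  # standard table-tennis ball diameter
PACKING_FRACTION = 0.64  # random close packing of equal spheres (assumed)
BALL_COUNT = 6.4e9       # Facebook's estimate for Building One

ball_volume_m3 = (4 / 3) * math.pi * (BALL_DIAMETER_M / 2) ** 3
implied_volume_m3 = BALL_COUNT * ball_volume_m3 / PACKING_FRACTION

# Roughly 335,000 cubic meters, which is consistent with a building of
# Altoona's scale under 20-30 ft ceilings.
```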

  • Facebook: Altoona, Iowa

    Altoona 1 plans were first unveiled more than a year ago. Since then, more than 460 people have worked on the project, logging more than 435,000 hours in the ongoing construction of the 476,000-square-foot building.

    Pending local council approval, Facebook is planning to break ground on the second datacenter building, designed to mirror the first and aptly named Altoona 2.

  • Facebook: Forest City, North Carolina

    The rural 160-acre campus in Forest City, N.C. opened in 2012, taking the building blocks of the Open Compute Project to a new level.

    Thanks to design efficiencies attributed to OCP, Facebook said it saved $1.2 billion in infrastructure costs, along with enough energy to power 40,000 homes for a year, the carbon equivalent of taking 50,000 cars off the road.

  • Facebook: Forest City, North Carolina

    To demonstrate how these energy and cost savings happen, Facebook explained it reuses computer server heat by taking a portion of the excess heat and using it to heat office space during colder months.

  • Facebook: Forest City, North Carolina

    An evaporative cooling system is used to evaporate water to cool the incoming air, as opposed to traditional chiller systems that require more energy-intensive equipment. This process is championed as highly energy efficient, minimizing water consumption by using outside air.
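    Direct evaporative cooling is commonly modeled with the supply air temperature approaching the outside air's wet-bulb temperature. A minimal sketch of that relationship; the effectiveness figure and example temperatures are assumptions, not numbers from Facebook:

```python
def evap_cooled_temp(dry_bulb_c: float, wet_bulb_c: float,
                     effectiveness: float = 0.9) -> float:
    """Direct evaporative cooling: supply air temperature falls toward
    the wet-bulb temperature, scaled by the cooling media's effectiveness."""
    return dry_bulb_c - effectiveness * (dry_bulb_c - wet_bulb_c)

# Example: hot outside air at 35 C with a 20 C wet-bulb temperature
# yields 21.5 C supply air without running a chiller.
supply = evap_cooled_temp(35.0, 20.0)
```

    The drier the air (the bigger the gap between dry-bulb and wet-bulb temperature), the more cooling the water evaporation can deliver.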

  • Facebook: Forest City, North Carolina

    The Forest City center runs 100 percent on outdoor air, saving on heating and cooling costs from power-hungry air handlers.

  • Facebook: Luleå, Sweden

    When in Sweden, Facebook is evidently doing as the Swedes do. Facebook's first facility abroad is (no joke) taking an Ikea-like approach with its own out-of-the-box, pre-fab datacenter blueprint.

  • Facebook: Luleå, Sweden

    Dubbed the rapid deployment datacenter design (RDDC), the guide takes modular and lean construction principles and applies them at the scale of a Facebook datacenter.

    The RDDC design is based on two concepts. In the chassis approach, a structural frame is built and all the components, from lighting to cables, are attached to it on an assembly line in a factory; the entire assembly is then driven to the building site on the back of a flatbed truck.

    Facebook believes this will enable it to deploy two data halls in the time it previously took to deploy one while also cutting back greatly on the amount of material required for construction.

  • Facebook: Luleå, Sweden

    Facebook design engineer Marco Magarelli admitted in a blog post back in March that the RDDC design actually started out as a hack.

    "Our previous datacenter designs have called for a high capacity roof structure that carries the weight of all our distribution and our cooling penthouse; this type of construction requires a lot of work on lifts and assembly on site," Magarelli wrote. "Instead, as Ikea has done by packing all the components of a bookcase efficiently into one flat box, we sought to develop a concept where the walls of a datacenter would be panelized and could fit into standard modules that would be easily transportable to a site."

  • Facebook: Luleå, Sweden

    Since Facebook started deploying its open hardware, the social network estimated it has saved enough energy to power more than 40,000 homes for a year.

  • Facebook: Luleå, Sweden

    Supported by the power of datacenters like these, Facebook noted it sees an average of six billion likes per day alone. Over the last 10 years, Facebook's datacenters have seen more than 400 billion photos shared and 7.8 trillion messages sent.

  • Facebook: Luleå, Sweden

    Facebook is currently testing the chassis approach at its second building under construction at the Luleå campus. Spanning about 125,000 sq. ft., it will be the first Facebook datacenter building to feature the RDDC design upon completion.


By Rachel King for Between the Lines | August 29, 2014 -- 19:00 GMT (12:00 PDT) | Topic: Data Management
