Facebook outlines history, infrastructure behind Timeline

Summary: Facebook's Timeline started as a hackathon project and grew into a complete overhaul of the profile page.

Facebook's Timeline, which started as a hackathon project and grew into a complete redesign of the profile page, required significant changes to the infrastructure behind the scenes.

The project began with a four-person team: two full-time engineers, an engineering intern, and a designer. Together, they built a working demo in a single night. (That's remarkable when you consider how long it took for Timeline to reach users: from that hackathon night in 2010, to the announcement in September 2011, to the live debut in December.)

The team eventually grew, and by early 2011 it included a full development team split into groups covering design, front-end engineering, infrastructure engineering, and data migrations.

The team determined that it needed to build an entirely new infrastructure, one that could display data from weeks and years past rather than just the previous few days.

Facebook infrastructure engineer Ryan Mack explained on the Facebook blog that one of the "key priorities was eliminating technical risk by keeping the system as simple as possible and relying on internally-proven technologies."

Basically, Timeline is built upon four of Facebook's core technologies:

  • MySQL/InnoDB for storage and replication
  • Multifeed (the technology that powers News Feed) for ranking
  • Thrift for communications
  • Memcached for caching
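
To give a rough sense of how those pieces fit together, here is a minimal read-through cache sketch in Python: check Memcached first, and fall back to the database on a miss. Everything here (the pymemcache client, the key scheme, and the stubbed fetch_posts_from_mysql query) is an illustrative assumption, not Facebook's actual code.

```python
import json

from pymemcache.client.base import Client

# Assumes a memcached instance is running locally on the default port.
cache = Client(("localhost", 11211))


def fetch_posts_from_mysql(user_id, year):
    """Stand-in for a real MySQL/InnoDB query; returns canned rows here."""
    return [{"user_id": user_id, "year": year, "story": "joined Facebook"}]


def get_posts(user_id, year):
    """Read-through cache: try memcached first, fall back to the database on a miss."""
    key = f"timeline:{user_id}:{year}"  # hypothetical key scheme
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)

    posts = fetch_posts_from_mysql(user_id, year)
    cache.set(key, json.dumps(posts).encode("utf-8"), expire=3600)  # keep the result for an hour
    return posts


print(get_posts(42, 2010))
```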

One key piece of the infrastructure is the Timeline aggregator, which is built on top of the database and provides a set of story generators that support everything from "geographically clustering nearby check-ins to ranking status updates." Essentially, this is how Timeline determines which stories should be larger than others and what gets prioritized -- an automatic page layout editor of sorts.
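
Facebook hasn't published the aggregator itself, but conceptually it is a pipeline of story generators followed by a ranking pass. The toy Python sketch below illustrates that idea; the Story type, the scoring, and the clustering distance are all invented for illustration.

```python
from dataclasses import dataclass
from math import hypot


@dataclass
class Story:
    kind: str            # e.g. "checkin" or "status"
    text: str
    score: float = 0.0
    lat: float = 0.0
    lon: float = 0.0


def cluster_checkins(stories, max_distance=0.05):
    """Merge check-ins that are geographically close into a single story (toy version)."""
    checkins = [s for s in stories if s.kind == "checkin"]
    others = [s for s in stories if s.kind != "checkin"]
    clustered = []
    for story in checkins:
        for existing in clustered:
            if hypot(story.lat - existing.lat, story.lon - existing.lon) < max_distance:
                existing.text += f" + {story.text}"
                existing.score += story.score
                break
        else:
            clustered.append(story)
    return others + clustered


def rank_stories(stories):
    """Order stories by score; the layout can then render higher scores larger."""
    return sorted(stories, key=lambda s: s.score, reverse=True)


stories = [
    Story("checkin", "Coffee shop", 1.0, 37.770, -122.420),
    Story("checkin", "Bookstore next door", 0.5, 37.771, -122.421),
    Story("status", "Shipped Timeline!", 3.0),
]
for story in rank_stories(cluster_checkins(stories)):
    print(f"{story.score:4.1f}  {story.text}")
```

A real layout engine would weigh far more signals, but the shape is the same: generate stories, cluster related ones, rank them, and give the highest-scoring items the biggest boxes.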

Caching is also immensely important here, considering how much past (and future) data needs to remain easily accessible for 800 million members and counting.

That's where Memcached comes in: Mack boasts that the results of big queries (e.g., all of a user's posts in a single year) can be cached compactly and kept for a long period of time.
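
As a rough illustration of what compact, long-lived caching of a big query could look like, the sketch below compresses a year's worth of posts before writing it to Memcached with a long expiry. The compression scheme, key format, and 30-day TTL are assumptions for the example, not details from Mack's post.

```python
import json
import zlib

from pymemcache.client.base import Client

cache = Client(("localhost", 11211))  # assumes a local memcached instance

THIRTY_DAYS = 30 * 24 * 3600  # a deliberately long TTL, chosen arbitrarily here


def cache_year_of_posts(user_id, year, posts):
    """Compress a full year's result set so it stays compact in the cache."""
    payload = zlib.compress(json.dumps(posts).encode("utf-8"))
    cache.set(f"timeline:year:{user_id}:{year}", payload, expire=THIRTY_DAYS)


def load_year_of_posts(user_id, year):
    """Return the cached year of posts, or None if it expired or was evicted."""
    payload = cache.get(f"timeline:year:{user_id}:{year}")
    if payload is None:
        return None
    return json.loads(zlib.decompress(payload))


cache_year_of_posts(42, 2011, [{"month": 12, "story": "Timeline goes live"}])
print(load_year_of_posts(42, 2011))
```

Compressing the serialized result keeps each entry small, and because old years rarely change, a long expiry means they seldom need to be recomputed.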

For a far more detailed account of how this all came together, check out Mack's post on the official Facebook blog.
