5 green data center tips from eBay's Project Mercury

Despite its location in scorching Phoenix, Ariz., the online marketplace's latest facility can be tuned to operate at a PUE of less than 1.2.
Written by Heather Clancy, Contributor

I won't be able to attend the Uptime Institute's annual symposium on green IT best practices in Santa Clara, Calif., so I requested an interview with the keynote speaker for the first day: eBay's Vice President of Global Foundation Services, Dean Nelson.

Nelson and his team are responsible for eBay's Project Mercury, the initiative that created one of the world's most energy-efficient data centers -- which also happens to be located in scorching Phoenix, Arizona. This matters because eBay is one of the most data center-intensive companies in business today.

In 2011, the company had an average of 300 million items for sale at any given moment on its online marketplace. It supports 100 million active users and processes $2,000 in transactions every second. Data center efficiency equals operational efficiency for eBay. Nelson's ongoing goal: to run the highest possible number of transactions on the fewest possible watts.

Project Mercury, which is the subject of a lengthy case study by the Green Grid, is a testament to the effectiveness of server standardization, data center containerization and free cooling, despite the climate. The initiative just won a 2012 Green Enterprise IT Award from the Uptime Institute. Much of the innovation centers on the modular data center on the facility's rooftop.

As I chatted with Nelson (for close to an hour), we touched on 5 best practices from Project Mercury that could also make sense in your own data center. I encourage you to read the entire Green Grid case study for more details than I can possibly post here, but here are some of my takeaways.

  • Get your facilities and IT teams on the same page, from the beginning. The cooling needs of the server and infrastructure technology you are buying cannot be an afterthought. "I pay the power bill AND I buy the hardware," Nelson said.
  • Make the most of your location. OK, so the Arizona desert may not seem like a smart place to build a data center, but the realities of Internet connectivity and latency dictated the siting decision. "I am bound technically by this," Nelson said. Even so, eBay's data center is able to use free cooling year-round and still maintain a power usage effectiveness (PUE) of less than 1.2, the project's design requirement.
  • Standardize server configurations. One of the biggest aha moments for the eBay team was the realization that it was running more than 200 different server SKUs. When it analyzed the configurations, it determined that approximately 80 percent of them could be served by just 15 architectures. Now it is down to just two main configurations. "By having those two SKUs, we can optimize the container configuration and cooling," Nelson said.
  • Think modular. Speaking of containers, the eBay data center features a modular data center designed in partnership with Dell. The measured PUE for this container is 1.043; it makes use of free cooling at rack densities of up to 26 kilowatts, despite outdoor temperatures that reach 119 degrees Fahrenheit in the summer. The modular design also allowed eBay to deploy the servers in less than six months. Modularity helps with tuning, too: when eBay lowered the racks' maximum power draw from 28 kilowatts to 26 kilowatts, it cut energy load by 7 percent, for a monthly savings of $1,000 on its electricity bill (a quick back-of-the-envelope check of that math appears after this list). Another benefit of modularity: the racks can support up to five generations of future technologies, which reduces long-term operational costs.
  • Consider the total cost of ownership. Yes, you've heard this before. But literally one of the deciding factors in eBay's vendor choice was the amount of electricity that servers used when they were idling. If eBay had considered only the hardware procurement costs of the servers it used for Project Mercury, it would have chosen a different vendor, Nelson said.
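
Two of the numbers above invite a quick sanity check: the sub-1.2 PUE and the roughly 7 percent savings from throttling racks from 28 kilowatts to 26 kilowatts. Below is a minimal back-of-the-envelope sketch in Python. The container-level power figures in it are illustrative assumptions chosen only to show how PUE is computed (total facility power divided by IT equipment power), not eBay's measured loads; the rack figures are the ones Nelson cited.

    # Back-of-the-envelope sketch. The facility/IT load figures below are
    # illustrative assumptions, not eBay's measured numbers; only the
    # 28 kW -> 26 kW rack limits come from the article.

    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        """Power usage effectiveness: total facility power / IT equipment power."""
        return total_facility_kw / it_equipment_kw

    # Hypothetical loads picked so the ratio lands near the 1.043 PUE
    # measured for the Dell-built container.
    it_load_kw = 1000.0        # assumed IT equipment draw
    facility_load_kw = 1043.0  # assumed total draw (IT + cooling + losses)
    print(f"PUE ~ {pue(facility_load_kw, it_load_kw):.3f}")  # 1.043

    # Rack tuning: dropping the per-rack ceiling from 28 kW to 26 kW.
    old_kw, new_kw = 28.0, 26.0
    reduction = (old_kw - new_kw) / old_kw
    print(f"Energy load reduction ~ {reduction:.1%}")  # ~7.1%, the ~7% Nelson cited

The point is not the specific inputs but that both headline figures fall out of simple ratios, which is what makes them practical knobs to track while tuning a facility.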
