
eBay datacenter chief Dean Nelson: 'We are living Moore's Law'

eBay datacenter chief Dean Nelson says that with $2,000 of goods being traded every second on eBay, the company can't afford not to have the fastest, most powerful and most efficient servers.
Written by Melanie D.G. Kaplan

In May, as part of its data center consolidation strategy, eBay opened Project Topaz in South Jordan, Utah—its largest and most efficient data center. The site, owned (rather than leased) by eBay, encompasses 60 acres and allows the company to eventually quadruple the size of its current operation.

Dean Nelson is eBay's senior director of Global Data Center Strategy and Operations. He was previously the senior director of Global Lab & Data Center Design Services at Sun Microsystems, and his job today focuses on the mission-critical infrastructure for eBay, PayPal, Shopping.com and other eBay businesses. I recently talked with him about using leased IT equipment to increase efficiency at Topaz.

Tell me about eBay's tech refresh cycle.

We have a process internally in which we lease equipment and replace it every two years with the newest, most powerful equipment. It used to be that cost and power consumption grew in lockstep with capacity; we have now broken that linear relationship. We are living Moore's Law. It comes down to how much work can get done. Our computing output has doubled, and the cost is flat.

Who leases you the equipment?

We have multiple vendors, and we always split the business between two of them. Everyone has a chance to play.

So every two years all the equipment is replaced?

The two-year tech refresh is staggered. It's not as if everything is replaced on one day every two years. Every month something is going in and out of the data center. The center has to be extremely flexible for that to happen.

Have you always leased hardware?

No. We've done this over the last three years. The leasing model is fairly unique. We're forcing the financial model and internal processes around the idea that this equipment will go away every two years. For most companies that equipment is still really good, but we have to be able to deliver for 90 million customers. We do more transactions on our site than the New York Stock Exchange.

None of this is wasted. We put a business model in place that allows us to optimize cost, efficiency and performance. At $2,000 per second of goods being traded, we need a really efficient engine to be able to deliver. So this data center is all those different elements coming together.

I joined the company 11 months ago. One of the main reasons I joined is that the organizational structure at eBay is different from that at most other companies. The data center sits on the same side as IT: one VP, one budget, one goal. Everything lines up neatly. We're looking at it holistically, so we can see how efficiently we can run this machine.

You have also donated some of your obsolete servers?

We had an agreement with [the University of] Notre Dame where we were donating servers to them for medical research. We donated over 100 servers, which they used in research labs for AIDS and cancer research.

What other technologies and programs have you implemented to make the data center so efficient?

For electrical, we're using 400 volts; the higher the voltage going to a server, the more efficiently it runs. We decided to go with higher voltage everywhere in the data center, which is 2 percent more efficient. Power comes in from the substation at 380,000 volts, and you have to keep stepping that down, and every step-down loses efficiency. Going to 400 volts eliminates a layer of transformers and increases efficiency.
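
As a rough illustration (the 98 percent figure is an assumption, not a number from eBay): if each step-down transformer stage runs at about 98 percent efficiency, then removing one stage from the chain recovers roughly 1/0.98 − 1 ≈ 2 percent of delivered power, which lines up with the gain Nelson cites for 400-volt distribution.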

We have power distribution through Starline Busway. We can snap in a power connection within minutes instead of calling an electrician; what used to take weeks now takes a technician just snapping in a power plug. Everything is visible, so we can make decisions very quickly, and many of them don't require a human at all; the system lets us know.

And should something go wrong, you have backups to the backups.

Right. For example, the dynamic cooling is remotely monitored. Say a certain component fails; another one picks it up. And we have an extra one, plus one.

How is Utah, as a data center host state?

Utah's been great. There's a data center community. We’ve reached out to our neighbors here and built up a group to share information and collaborate.

Data Center Pulse is a nonprofit data center industry community, and we started a local chapter in April. The first meeting we had was a meet and greet, and the nine people there represented 40 megawatts' worth of consumption. Those who couldn't attend represented another 160 megawatts. The NSA [National Security Agency] is putting $1.6 billion into a data center in Utah, and Oracle is building a center.

So what have you learned?

We all have the same challenges: cost issues, density issues, sustainability goals. The knowledge sharing in those sessions is huge. eBay is transparent. We're now trying to get our peers together to share best practices.

Interested in more? Read a related interview about Project Topaz and the value of greener datacenters on SmartPlanet's Pure Genius blog.

Editor's Note: The original version of this post said that the facility is located in St. George; it is actually in South Jordan. Additionally, the datacenter's substation is 380,000 volts, not 138,000. The post has been corrected.
