
Google: Rewriting the book on data centers

Written by Dan Farber

Om Malik had an interesting post about Google's super-sized data centers. He posited that Google's massive infrastructure, customized for its processes, represents the big barrier to entry for its rivals, and that it will continue to give the company an edge as long as it keeps spending billions on its custom infrastructure.

Indeed, Google is spending billions on equipment and buying up thousands of acres around the globe to better control its own infrastructure destiny and to deliver search results quickly and accurately. The company is spending several billion dollars to build data centers, collectively housing hundreds of thousands of servers running massively parallel computations, in power-friendly locations in Iowa, Oklahoma, Oregon and North and South Carolina, and that's just in the U.S.

Google data center under construction in The Dalles, Ore.

Lloyd Taylor, the former director of global operations at Google and now vice president of technology operations at LinkedIn, attributes part of Google's success to its data center design and science. "Everything goes back to physics," the former member of the Johns Hopkins University Applied Physics Lab told me.

Google threw out the book on data centers and went back to basic heat transfer and electrical theory, he explained, eliminating everything not strictly necessary and coming up with a minimalist design.

It's similar to how the integrated circuit industry applies a step and repeat concept for building circuit boards, he said. "Buy 1,000 acres in Oklahoma and build a data center and when you need more space, step and repeat, build another one next to it. As the business needs more space, a new data center is up and running in less than six months."

The physics comes in understanding how to gain more efficiency in power and cooling, for example. "Most data centers have multiple transfers between the server, the air around the server, the cooling towers and so forth. Any time you do a transfer between any two media, heat transfer theory will tell you that you are losing efficiency. So, you reduce the number of transfers and have a very simple connection from the server itself to the cooling towers," Taylor said.
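To make the intuition concrete, here is a back-of-the-envelope sketch of why cutting out intermediate transfer steps raises overall cooling efficiency. The per-step effectiveness value is purely illustrative, not a measured or published Google figure:

```python
# Illustrative sketch: each heat-exchange step in the chain
# (server -> air -> chilled water -> cooling tower, etc.) passes
# along only a fraction of the heat-removal potential. Treating each
# step as having an effectiveness below 1, the overall effectiveness
# is the product across steps, so fewer steps means less loss.
# The 0.85 value is an assumption chosen only for illustration.

def chain_effectiveness(step_effectiveness, num_steps):
    """Overall effectiveness of a chain of heat-exchange steps."""
    return step_effectiveness ** num_steps

for steps in (4, 3, 2):
    eff = chain_effectiveness(0.85, steps)
    print(f"{steps} transfer steps -> overall effectiveness {eff:.2f}")
```

Under these toy numbers, going from four transfer steps to two lifts overall effectiveness from roughly 0.52 to 0.72, which is the shape of the argument Taylor is making.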

In addition, redundancy is handled in the software and hardware layers, which watch for components that aren't working and proactively redirect loads. "If a meteor hits in Oregon, you may have to hit reload to get a search result, but that's about it," Taylor said.
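A minimal sketch of that idea, routing requests around failed replicas in software rather than relying on redundant hardware. The replica names and the class here are hypothetical illustrations, not Google's actual systems:

```python
# Hypothetical sketch: the serving layer tracks replica health and
# routes each request to whatever is still up.
import random

class ReplicaPool:
    def __init__(self, replicas):
        self.healthy = set(replicas)

    def mark_down(self, replica):
        # A health checker would call this when a replica stops responding.
        self.healthy.discard(replica)

    def route(self, request):
        if not self.healthy:
            raise RuntimeError("no healthy replicas; retry in another region")
        # Spread load across the surviving replicas.
        return random.choice(sorted(self.healthy)), request

pool = ReplicaPool(["oregon-1", "oregon-2", "oklahoma-1"])
pool.mark_down("oregon-1")              # e.g. the hypothetical meteor strike
print(pool.route("search?q=data+centers"))  # served by a surviving replica
```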

Google has patented many of its hardware and software infrastructure innovations, and as Ethan Stock said in a blog post, "Google is the server-side doppelganger to Apple, and their platforms like GFS, BigTable, MapReduce, and Sawzall are core to their competitive advantage." The major difference is that Google's servers are behind the curtain and Apple's devices are center stage.
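For readers unfamiliar with MapReduce, the toy word count below shows the shape of the programming model. It is a single-process illustration of the map -> shuffle -> reduce phases, not Google's implementation, which runs those phases across thousands of machines over GFS:

```python
# Toy word count in the MapReduce style: mappers emit (key, value)
# pairs, a shuffle groups values by key, and reducers aggregate.
from collections import defaultdict

def map_phase(doc):
    # Emit (word, 1) for every word, the classic word-count mapper.
    for word in doc.split():
        yield word.lower(), 1

def reduce_phase(word, counts):
    # Sum all counts emitted for one key.
    return word, sum(counts)

docs = ["Google rewrote the book", "the book on data centers"]

# Shuffle: group mapper output by key.
groups = defaultdict(list)
for doc in docs:
    for word, count in map_phase(doc):
        groups[word].append(count)

results = [reduce_phase(word, counts) for word, counts in groups.items()]
print(sorted(results))
```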

The hosting, backup generator and cooling tower businesses in general are booming. DuPont Fabros Technology, for example, is building a data center in Northern Virginia that occupies 17.15 acres, with a 348,464-square-foot facility, 171,200 square feet of raised floor and a power capacity of 36.4 megawatts.
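For scale, those quoted figures imply a rough power density; a quick calculation using only the numbers above:

```python
# Back-of-the-envelope density from the DuPont Fabros figures quoted above.
watts = 36.4e6              # 36.4 MW power capacity
raised_floor_sqft = 171_200  # raised-floor area

print(f"{watts / raised_floor_sqft:.0f} W per square foot of raised floor")
# -> roughly 213 W/sq ft of raised floor
```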

Over time, whatever Google has done to gain more efficiency and cost savings in its data center design will be in textbooks and less of a competitive advantage. The Red Shift, Sun CTO Greg Papadopoulos' idea that we'll end up with a handful of super-scaled data centers powering everything digital on the planet, is a long way off, and there will always be room for more niche operations.

Among the attendees at a session at Gartner's Data Center Conference in Las Vegas last month, only five percent said they would outsource their data center operations to a third-party provider.

Like Arctic ice gradually melting into the ocean, enterprises want to migrate slowly to the cloud, or utility, computing model, with services delivered by massive-scale data centers. But the tech industry will have its own global warming, pushing companies to move to the cloud faster to stay competitive.

See also: Data Center Knowledge
