
Google's huge data centers: the IT failures question

Written by Michael Krigsman, Contributor

Harper's has published a blueprint drawing for Google's new Oregon data center. Dan Farber points out that, "the 68,680 square-foot facility...is expected to demand 103-megawatts of electricity, which would power about 80,000 homes."
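As a back-of-envelope sanity check on those quoted figures, the implied average load per home works out to a plausible number. This is a sketch: the 103-megawatt and 80,000-home figures come from the article; the interpretation as a simple average is my assumption.

```python
# Back-of-envelope check: does 103 MW plausibly power ~80,000 homes?
facility_demand_mw = 103   # figure quoted from Harper's blueprint
homes_powered = 80_000     # figure quoted in the article

# Implied average draw per home, in kilowatts
kw_per_home = facility_demand_mw * 1000 / homes_powered
print(f"Implied average load per home: {kw_per_home:.2f} kW")
```

That comes to roughly 1.3 kW per home, in line with typical average US household electricity consumption, so the comparison in the article holds up.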

Meanwhile, AFP says, "Google was looking at Malaysia, India or Vietnam to establish the world's biggest server farm." Ordinarily, I wouldn't put much credence in such speculation, but AFP is a well-established news organization, having been around since 1835.

From a project failures perspective, I wonder what risk such concentrated operations pose for application downtime and reliability. Despite the best-laid plans, data centers do suffer outages, and real customers feel the consequences. Consider the massive impact a failure at such a huge facility would create.

As IT steadily migrates into the Software as a Service (SaaS) cloud, data center and web reliability issues will become increasingly important. Phil Wainewright correctly asserts:

[Cloud customers] will be looking for far better outage management and service level reporting in the future than they’ve tolerated to date.
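Service-level reporting ultimately comes down to arithmetic over an allowed downtime budget. As a quick illustration (a sketch; the uptime tiers below are common industry examples, not figures from this post or from Phil Wainewright):

```python
# Annual downtime budget implied by common "nines" uptime levels
MINUTES_PER_YEAR = 365 * 24 * 60  # ignoring leap years for simplicity

def annual_downtime_minutes(uptime_pct):
    """Minutes of downtime per year permitted at a given uptime percentage."""
    return MINUTES_PER_YEAR * (1 - uptime_pct / 100)

for pct in (99.0, 99.9, 99.99):
    print(f"{pct}% uptime allows {annual_downtime_minutes(pct):.1f} min/year of downtime")
```

The gap between tiers is what makes outage management at a single enormous facility so consequential: at "three nines," a service gets under nine hours of downtime per year, so one major data center incident can blow the entire budget.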

Expect to hear more on this subject in the IT Project Failures blog. In the meantime, please comment with your opinion regarding the impact of large data centers on IT failure.
