The race for your data center has already begun. Google, Microsoft, and Amazon are the leading players in a global data center build-out that has not been slowed by the current economic recession and that over the next decade will change the face of both consumer computing and IT departments.
The reason these three companies are building out data center capacity around the world at a breakneck pace is that they want to be ready with enough capacity to handle the two big developments poised to transform the technology world:
- Cloud computing: Applications and services delivered over the Internet
- Utility computing: On-demand server capacity powered by virtualization and delivered over the Internet
With both of these trends, the biggest target is private data centers. Cloud computing wants to run the big commoditized applications (mail, groupware, CRM, etc.) so that an IT department doesn't have to run them from a private data center.
Utility computing wants to simply take over server capacity for private services and applications, using virtualization to seamlessly scale up and scale down those services so that an organization only has to pay for the bandwidth and server capacity that it uses. What most IT departments do now is pay for maximum capacity at all times, with very low utilization, and also risk downtime at peak times if their systems get overloaded because they haven't planned for enough capacity at the high end.
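The economics described above can be sketched with some quick arithmetic. The demand curve and hourly prices below are made-up illustrative numbers, not real vendor rates; the point is simply that a fleet sized for peak load sits mostly idle, while pay-as-you-go capacity tracks actual usage:

```python
# Hypothetical comparison: a fixed fleet sized for peak demand vs.
# on-demand capacity that scales with actual hourly load.
# All prices and demand figures are invented for illustration.

HOURS = 24
# Servers actually needed each hour (quiet overnight, spike midday).
demand = [2, 2, 2, 2, 3, 4, 6, 9, 14, 18, 20, 20,
          19, 18, 16, 14, 12, 10, 8, 6, 5, 4, 3, 2]

OWNED_COST_PER_SERVER_HOUR = 0.50    # amortized private data center cost
UTILITY_COST_PER_SERVER_HOUR = 0.60  # assumed on-demand premium

peak = max(demand)                                   # must provision for this
fixed_cost = peak * OWNED_COST_PER_SERVER_HOUR * HOURS
utility_cost = sum(demand) * UTILITY_COST_PER_SERVER_HOUR
utilization = sum(demand) / (peak * HOURS)

print(f"Peak-provisioned fleet: {peak} servers, ${fixed_cost:.2f}/day")
print(f"Pay-as-you-go:          ${utility_cost:.2f}/day")
print(f"Fixed-fleet utilization: {utilization:.0%}")
```

Even with the on-demand price per server-hour set 20% higher than the owned cost, the pay-as-you-go bill comes out well under the peak-provisioned one here, because the fixed fleet runs at under half utilization. And this toy model ignores the downtime risk on the other side: if demand ever exceeds the fixed peak, the owned fleet simply falls over.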
This is an aerial view of Microsoft's data center in San Antonio, Texas. Microsoft has been adding roughly 10,000 new servers per month as part of its ambitious data center build-out. Photo credit: Microsoft
It makes perfect sense that Microsoft and Google would want to go after this market. Microsoft already runs a lot of the server software that powers the back office, while Google already has lots of expertise in running data centers to power the Internet. Amazon's place here may appear odd to some, since the company started as an online book seller and has evolved into the Web's biggest mega-retailer.
However, Amazon has arguably become the current market leader in utility computing by using the knowledge it gained in building the infrastructure for its e-commerce business and turning it into Amazon Web Services in which it rents server capacity to other companies. In 2008, CEO Jeff Bezos even revealed that Amazon Web Services now uses more bandwidth than Amazon.com (see chart below).
I also expect IBM and Hewlett-Packard to join the party. While neither of those two is drawing the same attention for building new data centers as Google, Microsoft, and Amazon, both are in the midst of massive, multi-year data center consolidations, and it's certainly possible that they are quietly building lots of extra capacity as part of those projects. HP has been a long-time proponent of utility computing, and IBM recently gave a public endorsement of cloud computing.
What all of these vendors will argue is that they can save organizations from overprovisioning and overspending on server capacity while also adding 24/7/365 monitoring, scalable load management, and a high level of IT service management (ITSM). Of course, the trade-off is that IT departments give up some control, and usually some staff positions as well.
This is essentially an outsourcing arrangement in which IT turns over a chunk of its operations to a third party. Many companies will be fine with that since IT is probably not one of their core competencies. They will welcome the expertise from a third party and will be happy to find a new way to control IT costs.
However, other IT departments and organizations are going to be far more reluctant to turn over their services, applications, and company data to a vendor. Just last week when Google announced that Google Apps can now use Microsoft Outlook as a client, I asked TechRepublic members, "Would you trust Google with your company's Exchange Server data?"
"I'm not sure it wouldn't land you in prison in many countries for violation of various laws on privacy and data security," wrote Deadly Earnest, an IT consultant in Australia.
"In the UK the Data Protection Act states that you must be able to disclose the location of your data, i.e. at any given moment you at least know what country it is in," wrote Tom-Tech, a UK software developer. "I'm pretty sure Google doesn't do this as they aren't exactly forthcoming with the locations of their data centers and the data belonging to a specific company probably shifts between a few of them anyway."
Zeplenith, an IT manager in Virginia, asked, "Where is the data stored? Are they SAS-70 certified?"
As such, security, privacy, and compliance are major hurdles that cloud computing and utility computing still must overcome. Nevertheless, we should expect that vendors are well aware of these hurdles and will be working with governments, regulators, and standards agencies to develop services that are fully compliant.
We can also expect that vendors will trip over each other trying to prove which one has the stronger security and privacy policies, because they know those factors are deal-breakers. It's not going to happen overnight, but these obstacles will very likely be overcome. The companies involved have invested too much in the future -- and IT has too much to gain from a cost and management perspective -- for these issues to not be resolved.
For more insights on cloud computing and other tech topics, follow my Twitter stream at twitter.com/jasonhiner
For governments, large financial institutions, and other high-security environments, outsourcing the data center will probably never make sense. For virtually everyone else, it's going to become a very attractive option in the next 3-5 years. I suspect that a decade from now running your own data center will be the exception and not the rule, and IT departments will need a strong business case to justify the existence of a private data center.
The massive vendor build-outs currently underway are evidence that this day is coming sooner than you might think. IT leaders should prepare by experimenting with low-priority apps and workloads now, and by thinking ahead about which parts of their current infrastructure will make the most sense in utility computing and which parts will present the biggest challenges.