In its Global Cloud Index report (PDF), Cisco predicts that cloud data center traffic will exceed 14 zettabytes in 2020, an increase of 262 percent from 2015. It is no secret that organizations of all sizes are moving to the cloud -- for strategic, financial, and operational flexibility and scalability.
But does this mean an end to company data centers? Hardly.
Research firm Uptime Institute reports that 50 percent of enterprise IT data center budgets have been flat or shrinking over the last five years, and that 55 percent of enterprise server footprints have been flat or shrinking as well. But it also reports that, as of 2016, more than 70 percent of enterprise workloads were still running in corporate data centers, with colocation data centers hosting 20 percent of systems and only 9 percent of systems in the cloud.
My own observation as an IT consultant is that the move to the cloud is more aggressive than the Uptime survey reflects. But there is one undeniable fact I observe whether I'm visiting a large enterprise with a multimillion-dollar data center or a small shop with its phone system, communications bank, and a single applications server in a little 'data center' closet: Nearly everyone has some type of in-house computing.
So why are businesses hanging on to their data centers?
If you've ever stood in front of your board of directors as a small company CEO or as the CIO of a larger company, and you've had to explain why your systems were down or how your network got invaded by malware, you understand why companies are hesitant to hand over all of their systems and data to third parties. Direct management of your mission-critical IT assets is still considered best practice.
"The biggest risk is giving up control of your data to someone else using different data centers in remote places," said Gavan Egan, managing director of cloud and IT solutions for Verizon. "What happens in the event of a disaster? You're also putting your data next to someone else's."
In highly regulated industries like healthcare, insurance, and finance, when your industry examiner visits and asks about IT security, and you explain that you're using third-party cloud-based systems, they're going to ask whether they can look at the cloud provider's third-party audit report.
With the plethora of cloud offerings in today's marketplace, not every provider conducts formal (and expensive) outside audits of its IT, so it may have no independent third-party audit report to furnish to you as a customer. The result is that you receive a lower IT security rating from your own examiner, who is likely to feel that you are exposing yourself to more risk than you should.
These worries about cloud security and governance aren't unfounded. They are a major reason why many companies using cloud services opt for colocation, which gives the company access to the colocation provider's building, cooling, power, bandwidth, and physical security to save on data center costs, while the company installs its own servers and storage -- and maintains direct control over its systems.
What happens if you subscribe to a cloud-based software service and the vendor doesn't fully own and operate its own data centers? It's not a problem -- unless there is an outage at your cloud provider's data center provider, and you don't have a direct relationship with that data center provider. In contract law, this is called 'lack of privity': you have no direct contractual agreement with the underlying data center vendor, so you have no leverage to enforce a contract when the fault for the outage lies with the data center rather than your primary vendor.
This also means that you have increased liability exposure and business risk.
When companies colocate their IT with cloud vendors, contract directly for the data center services, and maintain their own IT, this risk is reduced. It is also a reason why many companies, when they shop for cloud solutions, ask prospective vendors in their RFPs whether they own and operate their own data centers.
Most cloud-based vendors in the business applications space run multi-tenant computing models in which hundreds or even thousands of clients share a common business application (ERP, CRM, or sales, for example). The application is upgraded regularly -- typically quarterly or annually -- based on the enhancement requests that come into the vendor from clients. Most of these vendor application systems also offer opportunities for companies to customize, but the caveat is usually that companies must share their new custom code with other clients. For companies that develop highly proprietary custom applications that contribute to competitive advantage, this can be an unacceptable computing model. Instead, they opt to keep these mission-critical systems (and their intellectual property) in-house.
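To make the multi-tenant model concrete, here is a minimal sketch of the core pattern: many clients share one schema, and every query is scoped by a tenant identifier. The table and column names are illustrative assumptions, not any specific vendor's design.

```python
import sqlite3

# One shared table serves every client; a tenant_id column keeps each
# client's data isolated. Schema and names are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (tenant_id TEXT, order_no INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("acme", 1, 99.0), ("acme", 2, 150.0), ("globex", 1, 42.0)],
)

def orders_for(tenant: str):
    """Every query is scoped to the caller's tenant -- the heart of the model."""
    return conn.execute(
        "SELECT order_no, amount FROM orders WHERE tenant_id = ?", (tenant,)
    ).fetchall()

# Each client sees only its own slice of the shared table.
print(orders_for("acme"))
print(orders_for("globex"))
```

The shared schema is why customizations are hard to keep private: a change to the common application typically lands in every tenant's environment.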
Cloud vendors have made progress since the days of outages and weak governance that gave corporate IT pause about using the cloud for disaster-recovery failover, 24/7 uptime, and premium system performance. But some of these challenges persist. Even if your cloud vendor performs in an exemplary fashion on system performance and DR/failover, there is still the worry that the hardware and software in the vendor's data center will not stay in exact sync with what you run in your own data center. Even small differences can degrade system performance.
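One way to catch the kind of environment drift described above is a routine comparison of component versions between the on-prem and cloud-hosted stacks. A minimal sketch follows; the inventories here are hypothetical dictionaries standing in for output from whatever configuration-management or asset-inventory tooling you actually use.

```python
# Compare two environment inventories and report version drift.
# The component names and versions below are made-up examples.
def find_drift(on_prem: dict, cloud: dict) -> dict:
    """Return {component: (on-prem version, cloud version)} for mismatches."""
    drift = {}
    for component in sorted(set(on_prem) | set(cloud)):
        local, remote = on_prem.get(component), cloud.get(component)
        if local != remote:  # None means the component is absent on that side
            drift[component] = (local, remote)
    return drift

on_prem = {"kernel": "4.9.1", "db": "12.4", "app": "7.2"}
cloud   = {"kernel": "4.9.3", "db": "12.4", "app": "7.2", "agent": "1.0"}

for component, (local, remote) in find_drift(on_prem, cloud).items():
    print(f"{component}: on-prem={local} cloud={remote}")
```

Running a check like this on a schedule turns "the environments quietly diverged" into an actionable report before users notice the performance difference.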
Your internal users might grouse about slow IT responses to help desk tickets, but few will deny that it feels a lot better to pick up the phone and call someone down the hall in the IT data center than to call a cloud provider support center half a world away that uses an automated phone or chat system. For companies in industries where systems must stay up 24/7 with rapid response times (hotel and airline reservations, for example), supporting your own systems with your own people is vital. Many of these companies have also fine-tuned their systems for speed and resiliency, with the help of seasoned pros on their own IT staffs who know the ins and outs of their applications -- a level of expertise that a more generic cloud-based service simply can't provide. Companies like these choose either to maintain systems in their own data centers or to colocate systems in the cloud, with their own staffs running those systems.
Gartner predicts that the public cloud services market alone will grow 18 percent in 2017, to a total of $246.8 billion. This affirms the strength of cloud computing in corporate IT plans. But even as companies migrate more of their computing to the cloud and data center investments recede or stay flat, there are many reasons for companies to maintain critical IT assets on-prem. That is not likely to change anytime soon.
What has proven advantageous for today's IT decision makers is that they really can have the best of both the cloud-based and on-prem worlds, by pursuing a hybrid cloud strategy that combines private cloud, public cloud, and on-premises computing in a single enterprise IT infrastructure. This is why 85 percent of companies said in 2016 that they were using multiple clouds for their IT. Just as pivotal to the strategy is maintaining an internal data center presence.