By 2020, a good chunk of the enterprise computing world will be relying on a grand total of 485 hyperscale data centers across the globe to handle their workloads.
That's the call from Cisco, which just released its latest stats on current and future data center traffic for the world.
The bottom line is cloud computing is really turning the IT market upside down and shaking out all its loose parts. There's a great deal of efficiency and economies of scale in making use of shared services that are commodity in nature. In fact, the Cisco analysis holds, a lot of it will come down to 24 hyperscale operators that don't necessarily own data centers in their own right -- an operator "might own the data center facility, or it might lease it from a colocation/wholesale data center provider," according to the report's authors.
These hyperscale data centers will grow from 259 in number at the end of 2015 (the latest full year for which data is available) to 485 by 2020. They will represent 47% of all installed data center servers by 2020. "In other words, they will account for 83% of the public cloud server installed base, and 86 percent of public cloud workloads," the Cisco analysts add.
The authors of Cisco's latest data center traffic report project that within three to four years, 92 percent of workloads will be processed by cloud data centers -- versus only eight percent being processed by traditional data centers.
Is this a good thing, to have so many enterprises relying on so few for their applications and infrastructure? Look at it this way: the rise of packaged software in the 1990s resulted in reliance on a core group of software vendors, such as Microsoft, Oracle, IBM and SAP. Again, there are many functions that can be commoditized and don't need to be re-invented a million times across enterprises.
The real value is what gets built or configured on top of those packages. The success of the business doesn't come from technology itself, but how inspired and forward-thinking management can use that technology to the benefit of customers and employees alike. Plus, with the rise of big data and all things associated with it -- analytics, Internet of Things, security threats -- things are only getting more complicated, requiring scale and expertise that many enterprises simply can't afford on their own.
The Cisco report estimates that globally, the total amount of data stored in data centers will quintuple to reach 915 exabytes by 2020, up from 171 exabytes in 2015. Big data will account for 247 exabytes by 2020, up almost 10-fold from 25 exabytes in 2015. Big data alone will represent 27 percent of data stored in data centers by 2020, up from 15 percent in 2015.
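As a quick sanity check on the report's multiples (the exabyte figures are Cisco's; the arithmetic here is ours):

```python
# Cisco Global Cloud Index figures (exabytes), as cited above
stored_2015, stored_2020 = 171, 915
bigdata_2015, bigdata_2020 = 25, 247

# The multiples behind "quintuple" and "almost 10-fold"
print(round(stored_2020 / stored_2015, 1))    # ~5.4x growth in total stored data
print(round(bigdata_2020 / bigdata_2015, 1))  # ~9.9x growth in big data

# Big data's share of all stored data in 2020
print(round(100 * bigdata_2020 / stored_2020))  # ~27 percent
```

The numbers line up: roughly five-fold overall growth, nearly ten-fold for big data, and a 27 percent share by 2020.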
The Cisco report notes that a lot of the traffic is increasingly coming from big data and all things associated with it. "The growing importance of data analytics--the result of big data coming from ubiquitously networked end-user devices and IoT alike--has added to the value and growth of data centers," the report's authors observe. "The efficient and effective use of data center technology such as virtualization, new software-based architectures, and management tools and use of public vs. private resources and so on can all add to the agility, success, and competitive differentiation of a business."
By the way, Cisco provides a definition of what it considers to be a "hyperscale" provider:
- "More than $1 billion in annual revenue from Infrastructure as a Service (IaaS), Platform as a Service (PaaS), or infrastructure hosting services (for example, Amazon/AWS, Rackspace, Google)"
- "More than $2 billion in annual revenue from software as a service (SaaS) (for example, Salesforce, ADP, Google)"
- "More than $4 billion in annual revenue from Internet, search, and social networking (for example, Facebook, Yahoo, Apple)"
- "More than $8 billion in annual revenue from e-commerce/payment processing (for example, Amazon, Alibaba, eBay)"
Currently, the report notes, 24 hyperscale operators meet the preceding criteria.
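For illustration, the revenue thresholds above can be expressed as a simple check. This is a sketch, not anything from the report: the segment names are our own labels, and we read "more than $X billion" as a strict greater-than; the dollar figures are Cisco's.

```python
# Cisco's hyperscale criteria as threshold checks.
# Revenues are annual, in billions of US dollars; segment names are illustrative.
HYPERSCALE_THRESHOLDS = {
    "iaas_paas_hosting": 1,        # IaaS, PaaS, or infrastructure hosting
    "saas": 2,                     # software as a service
    "internet_search_social": 4,   # Internet, search, and social networking
    "ecommerce_payments": 8,       # e-commerce / payment processing
}

def is_hyperscale(revenue_by_segment):
    """Return True if revenue in any one segment clears its threshold."""
    return any(
        revenue_by_segment.get(segment, 0) > threshold
        for segment, threshold in HYPERSCALE_THRESHOLDS.items()
    )

print(is_hyperscale({"saas": 8.4}))              # True: clears the $2B SaaS bar
print(is_hyperscale({"ecommerce_payments": 3}))  # False: under the $8B bar
```

A provider only needs to clear one bar, which is why a single vendor such as Amazon or Google can qualify through more than one line of business.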