Cloud computing: 10 ways it will change by 2020

What are the issues, challenges and technologies that will frustrate and inspire those working on the cloud in 2020?

Right now we are in the early days of cloud computing, with many organisations taking their first, tentative steps. But by 2020 cloud is going to be a major — and permanent — part of the enterprise computing infrastructure.

Eight years from now we are likely to see low-power processors crunching many workloads in the cloud, housed in highly automated datacentres and supporting massively federated, scalable software architecture.

Cloud 2020
What form will cloud computing take in the year 2020?

Analyst group Forrester expects the global cloud computing market will grow from $35bn (£22.5bn) in 2011 to around $150bn by 2020 as it becomes key to many organisations' IT infrastructures.

Alongside this increase in demand from enterprise, there will be development in the technologies that support clouds, with rapid increases in processing power making cloud projects even cheaper, while technologies currently limited to supercomputing will make it into the mainstream.

And of course, by 2020 a generational shift will have taken place in organisations: a new generation of CIOs who have grown up using cloud-based tools will be in charge, making them far more willing to adopt cloud on an enterprise scale.

With all these developments in mind, here are 10 ways in which the cloud of 2020 will look radically different to the way it does today, as gleaned from the experts I've spoken to.

1. Software floats away from hardware

John Manley, director of HP's Automated Infrastructure Lab, argues that software will become divorced from hardware, with more and more technologies consumed as a service: "Cloud computing is the final means by which computing becomes invisible," he says.

As a result, by 2020, if you were to ask a CIO to draw a map of their infrastructure, they would not be able to, says David Merrill, chief economist of Hitachi Data Systems. "He will be able to say 'here are my partner providers'," Merrill says, but he will not be able to draw a diagram of his infrastructure.

This is because the infrastructure will exist in a "highly abstracted space", where software is written so that it passes through several layers before it interacts with hardware. As a result, front-end applications, or applications built on top of a platform-as-a-service, will be hardware agnostic.

2. Modular software

To take advantage of the huge armadas of hardware available via clouds, individual software applications are set to get larger and more complex as they are written to take advantage of scale.

With the growth in the size and complexity of individual programs, the software development process will place an emphasis on modular software — that is, large applications whose components can be modified without shutting down the program.

As a consequence, cloud applications will require a new programming mindset, especially as they interact with multiple clouds.

"Software has to be thought about differently," HP's Manley says, arguing that the management of federated services will be one of the main 2020 challenges. This is because applications are not only going to be based in the cloud, but will hook into other clouds and various on-premise applications as well.

In other words, different parts of applications will "float around" in and out of service providers. Assuring good service-level agreements for these complex software packages will be a challenge, Manley says.
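The idea of modifying components without shutting down the program can be illustrated with a simple component registry, sketched here in Python. This is a toy illustration of the principle, not any vendor's API; all names are hypothetical.

```python
# A minimal sketch of hot-swappable components: the application resolves
# each component through a registry at call time, so a component can be
# replaced while the program keeps running. All names are illustrative.

class ComponentRegistry:
    def __init__(self):
        self._components = {}

    def register(self, name, component):
        """Install or replace a component without stopping the application."""
        self._components[name] = component

    def call(self, name, *args):
        """Look up the current version of a component and invoke it."""
        return self._components[name](*args)

registry = ComponentRegistry()
registry.register("tax", lambda amount: amount * 0.20)  # version 1
v1_result = registry.call("tax", 100)                   # 20.0

# "Upgrade" the component in place; callers are unaffected.
registry.register("tax", lambda amount: amount * 0.25)  # version 2
v2_result = registry.call("tax", 100)                   # 25.0
```

Because callers go through the registry rather than holding a direct reference, swapping in version 2 needs no restart — the property Manley's federated, always-on cloud applications would depend on.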

3. Social software

Along with the modular shift, software could take on traits currently found in social-media applications like Facebook, says Merrill. Programs could form automatic, if fleeting, associations with bits of hardware and software according to their needs.

"It will be a social-media evolution," Merrill says. "You will have an infrastructure. It'll look like a cloud, but we will engineer these things so that a database will 'like' a server, [or] will 'like' a storage array."

In other words, the infrastructure and software of a datacentre will mould itself around the task required, rather than the other way around. Developers will no longer need to worry about provisioning storage, a server and a switch, Merrill says: all of this will happen automatically. 
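Merrill's "like" metaphor amounts to matching workloads to resources by declared needs rather than by hand. A toy sketch of that kind of need-based placement, with entirely hypothetical names:

```python
# Toy sketch of need-based placement: a workload declares the traits it
# needs, and the scheduler pairs it with any resource advertising those
# traits, rather than an operator provisioning storage, a server and a
# switch by hand. All names and traits are hypothetical.

resources = [
    {"name": "array-1", "traits": {"storage", "ssd"}},
    {"name": "server-9", "traits": {"compute", "gpu"}},
]

def place(workload_needs):
    """Return the resources whose advertised traits cover the workload's needs."""
    return [r["name"] for r in resources if workload_needs <= r["traits"]]

# A database "likes" any resource that offers SSD-backed storage.
matches = place({"storage", "ssd"})   # ['array-1']
```

The association is automatic and fleeting: change the declared needs, and the next lookup pairs the workload with different hardware, with no manual provisioning step.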

4. Commodity hardware rules

By 2020 the transition to low-cost hardware will be in full swing as schemes such as the Open Compute Project find their way out of the datacentres of Facebook and Amazon Web Services and into facilities operated by other, smaller companies as well. "Servers and storage devices will look like replaceable sleds," says Frank Frankovsky, Facebook's VP of hardware design and supply chain, and chairman of the Open Compute Project.

"Cloud computing is the final means by which computing becomes invisible" — John Manley, HP

By breaking infrastructure down into its basic components, replacements and upgrades can be done quickly, he says. The companies best placed to use this form of commoditised infrastructure are large businesses that operate huge datacentres. "I would say that between now and 2020, the fastest-growing sector of the market is going to be cloud service providers," Frankovsky says.

5. Low-power processors and cheaper clouds

We're around a year away from low-power ARM chips with 64-bit capability coming to market. Once that happens, uptake should accelerate as enterprise software is developed for the RISC chips, allowing companies to use the power-thrifty processors in their datacentres and thereby cut their electricity bills by an order of magnitude.

HP has created a pilot server platform — Redstone — as part of its Project Moonshot scheme to try to get ARM kit to its customers, while Dell has been selling custom ARM-based servers to huge cloud customers via its Data Center Solutions group for years.

By 2020 it's likely that low-power chips will be everywhere. And it won't just be ARM — Intel, aware of the threat, is working hard on driving down the power used by its Atom chips, though most efforts in this area are targeted at mobile devices rather than servers. Facebook thinks ARM adoption is going to start in storage equipment, then broaden to servers.

"I really do think it's going to have a dramatic impact on the amount of useful work, per dollar, you can get done," Frankovsky says. This should help cloud providers, such as Amazon Web Services, cut their electricity bills. Moreover, if they are caught in a price war with competitors, they are more likely to pass on at least a chunk of the savings to developers, in the form of price reductions.

6. Faster interconnects

The twinned needs of massively distributed applications and a rise in the core count of high-end processors will converge to bring super-fast interconnects into the datacentre.

Joseph Reger, chief technology officer of Fujitsu Technology Solutions, predicts that by 2020 we can expect communications in the datacentre to be "running at a speed in the low hundreds of gigabits per second".

Reger says he expects that there will be a "very rapid commodification" of high-end interconnect technologies, leading to a very cheap, very high-performance interconnect. This will let information be passed around datacentres at a greater rate than before, and at a lower cost, letting companies create larger applications that circulate more data through their hardware (known in the industry as 'chatty' apps), potentially allowing developers to build more intelligent, automated and complex programs.

7. Datacentres become ecosystems

Cloud datacentres will "become much like a breathing and living organism with different states", Reger says. The twinned technologies of abstracted software and commodified hardware should combine to make datacentres function much more like ecosystems, with an over-arching system ruling equipment via software, with hardware controlled from a single point, but growing and shrinking according to workloads.

Automation of basic tasks, such as patching and updating equipment, will mean the datacentre "will become more like a biological system", he says, in the sense that changes and corrections are made automatically.
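The "biological system" Reger describes — detect a deviation from the desired state, then correct it without human intervention — is essentially a reconciliation loop. A minimal sketch, with entirely hypothetical node names and versions:

```python
# Minimal sketch of a self-correcting reconciliation loop: compare each
# node's observed state with its desired state and apply the fix
# automatically, the way an automated datacentre might patch itself.
# Node names and version strings are hypothetical.

desired = {"node-a": "v2", "node-b": "v2"}
observed = {"node-a": "v2", "node-b": "v1"}  # node-b has drifted

def reconcile(desired, observed):
    """Return the corrections applied to bring observed state to desired."""
    actions = []
    for node, version in desired.items():
        if observed.get(node) != version:
            actions.append((node, version))
            observed[node] = version  # apply the "patch" automatically
    return actions

corrections = reconcile(desired, observed)   # [('node-b', 'v2')]
```

Run continuously, a loop like this keeps the estate converged on its desired state: the operator declares the target, and the system heals itself toward it.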

8. Clouds consolidate

The internet rewards scale, and with the huge capital costs associated with running clouds, it seems likely that there will be a degree of consolidation in the cloud provider market.

Fierce competition between a few large providers could be a good thing, as it would still drive each of them to experiment with radical technologies. For example, in a bid to cut its internal networking costs and boost utilisation, Google has recently moved its entire internal network to OpenFlow, the software-defined networking standard, which looks set to shake up the industry as more people adopt it.

Manley of HP argues there will be a variety of clouds that will be suited to specific purposes. "There's going to be diversity," he says. "I think you would only end up with a monopoly if there was an infrastructure around that was sufficiently capable to meet all the non-functional [infrastructure requirements] of those end services."

9. The generational shift

By 2020, a new generation of CIOs will have come into companies, and they will have been raised in a cloudy as-a-service world. There will be an expectation that things are available "as-a-service", Merrill says: "Our consumption model is changing as a generational issue."

And this new generation may lead to a shake-up in how businesses bill themselves for IT, Merrill says. "We have these archaic, tax-based, accounting-based rules that are prohibiting innovation," he adds.

10. Clouds will stratify

Today clouds are differentiated by whether they provide infrastructure-as-a-service, platform-as-a-service or software-as-a-service capabilities, but by 2020 more specialised clouds will have emerged.

According to Forrester, we can expect things like 'middle virtualisation tools' and 'dynamic BPO services' to appear by 2020, along with a host of other inelegantly named offerings. In other words, alongside some large providers offering basic technologies like storage and compute, there will also be a broad ecosystem of more specialised cloud providers, allowing companies to shift workloads to the cloud that would otherwise be dealt with by very specific (and typically very expensive) on-premise applications.

Merrill says clouds will, like any utility, be differentiated by their infrastructure capabilities into a whole new set of classes. "Just as we have power generation from coal, from natural gas, nuclear, hydroelectric, there will be differences," he says. "The economics, in my opinion, help us with differentiation and categorisation."
