Cloud computing: 10 ways it will change by 2020

Summary: What are the issues, challenges and technologies that will frustrate and inspire those working on the cloud in 2020?

Right now we are in the early days of cloud computing, with many organisations taking their first, tentative steps. But by 2020 cloud is going to be a major — and permanent — part of the enterprise computing infrastructure.

Eight years from now we are likely to see low-power processors crunching many workloads in the cloud, housed in highly automated datacentres and supporting massively federated, scalable software architecture.

Cloud 2020: What form will cloud computing take in the year 2020?

Analyst group Forrester expects the global cloud computing market to grow from $35bn (£22.5bn) in 2011 to around $150bn by 2020 as it becomes key to many organisations' IT infrastructures.

Alongside this increase in enterprise demand, the technologies that underpin clouds will advance too: rapid increases in processing power will make cloud projects even cheaper, and technologies currently confined to supercomputing will make their way into the mainstream.


And of course, by 2020 a generational shift will have taken place: the CIOs in charge will have grown up using cloud-based tools, making them far more willing to adopt cloud on an enterprise scale.

With all these developments in mind, here are 10 ways in which the cloud of 2020 will look radically different from the cloud of today, gleaned from the experts I've spoken to.

1. Software floats away from hardware

John Manley, director of HP's Automated Infrastructure Lab, argues that software will become divorced from hardware, with more and more technologies consumed as a service: "Cloud computing is the final means by which computing becomes invisible," he says.

As a result, by 2020, if you were to ask a CIO to draw a map of their infrastructure, they would not be able to, says David Merrill, chief economist of Hitachi Data Systems. "He will be able to say 'here are my partner providers'," Merrill says, but he will not be able to diagram the systems behind them.

This will be because the infrastructure will sit in a "highly abstracted space", where software passes through several layers of abstraction before it ever touches hardware. Front-end applications, or applications built on top of a platform-as-a-service, will therefore be hardware agnostic.
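To make that abstraction concrete, here is a minimal sketch, assuming a hypothetical BlobStore interface — none of the names map to a real provider's API. The application codes against the interface alone, so the backend beneath it can change without the application knowing.

```python
# A minimal sketch of hardware-agnostic application code: the app talks to
# an abstract BlobStore interface, and the concrete backend behind it
# (in-memory here, a cloud object store in production) is chosen at deploy
# time. All names are illustrative, not a real provider's API.
from abc import ABC, abstractmethod


class BlobStore(ABC):
    """What the application sees: no disks, arrays or regions."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryStore(BlobStore):
    """A stand-in backend; a PaaS provider would supply its own."""

    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]


def save_report(store: BlobStore, name: str, body: bytes) -> None:
    # The application never knows, or cares, what sits beneath the interface.
    store.put(f"reports/{name}", body)


store: BlobStore = InMemoryStore()  # swapped for a cloud backend at deploy time
save_report(store, "q3.txt", b"quarterly numbers")
print(store.get("reports/q3.txt"))
```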

2. Modular software

To take advantage of the huge armadas of hardware available via clouds, individual software applications are set to get larger and more complex as they are written to take advantage of scale.

With the growth in the size and complexity of individual programs, software development will place an emphasis on modular designs: large applications whose components can be modified or replaced without shutting the program down.
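As a toy illustration of that hot-swap idea, here is a sketch assuming a simple in-process component registry — every name is hypothetical. A component can be replaced with a newer version while the application keeps serving requests.

```python
# Illustrative sketch of "modular" software: components are registered by
# name and can be swapped for a new version while the application keeps
# running. Names and structure are hypothetical, not drawn from any vendor.
from typing import Callable, Dict

# A component here is just a callable that processes a request.
Component = Callable[[str], str]

registry: Dict[str, Component] = {}


def register(name: str, component: Component) -> None:
    """Install or replace a component without restarting the process."""
    registry[name] = component


def handle(name: str, request: str) -> str:
    return registry[name](request)


# Version 1 of the pricing component.
register("pricing", lambda req: f"v1 price for {req}")
print(handle("pricing", "widget"))

# Later, a new version is dropped in while requests keep flowing.
register("pricing", lambda req: f"v2 price for {req}")
print(handle("pricing", "widget"))
```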

As a consequence, cloud applications will require a new programming mindset, especially as they interact with multiple clouds.

"Software has to be thought about differently," HP's Manley says, arguing that the management of federated services will be one of the main 2020 challenges. This is because applications are not only going to be based in the cloud, but will hook into other clouds and various on-premise applications as well.

In other words, different parts of applications will "float around" in and out of service providers. Assuring good service-level agreements for these complex software packages will be a challenge, Manley says.

3. Social software

Along with the modular shift, software could take on traits currently found in social-media applications like Facebook, says Merrill. Programs could form automatic, if fleeting, associations with bits of hardware and software according to their needs.

"It will be a social-media evolution," Merrill says. "You will have an infrastructure. It'll look like a cloud, but we will engineer these things so that a database will 'like' a server, [or] will 'like' a storage array."

In other words, the infrastructure and software of a datacentre will mould themselves around the task at hand, rather than the other way around. Developers will no longer need to worry about provisioning storage, a server and a switch, Merrill says: all of this will happen automatically.
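A rough sketch of how that automatic pairing might look, with entirely made-up requirements and machine specs: each workload declares its needs, each server advertises its capacity, and a scoring function decides which server the workload "likes" most.

```python
# A toy sketch of the "database 'likes' a server" idea: each workload
# declares what it needs, each piece of infrastructure advertises what it
# offers, and a scoring step pairs them automatically. All figures and
# field names are made up for illustration.

workload = {"name": "orders-db", "needs_iops": 20000, "needs_ram_gb": 64}

servers = [
    {"name": "sled-a", "iops": 5000, "ram_gb": 32},
    {"name": "sled-b", "iops": 25000, "ram_gb": 128},
    {"name": "sled-c", "iops": 22000, "ram_gb": 64},
]


def affinity(workload: dict, server: dict) -> float:
    """Higher score = the workload 'likes' this server more."""
    if server["iops"] < workload["needs_iops"] or server["ram_gb"] < workload["needs_ram_gb"]:
        return 0.0  # cannot meet the requirements at all
    # Prefer the smallest machine that still fits, to avoid waste.
    headroom = (server["iops"] - workload["needs_iops"]) + (server["ram_gb"] - workload["needs_ram_gb"])
    return 1.0 / (1.0 + headroom)


best = max(servers, key=lambda s: affinity(workload, s))
print(f"{workload['name']} is placed on {best['name']}")
```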

4. Commodity hardware rules

By 2020 the transition to low-cost hardware will be in full swing as schemes such as the Open Compute Project find their way out of the datacentres of Facebook and Amazon Web Services and into facilities operated by other, smaller companies as well. "Servers and storage devices will look like replaceable sleds," says Frank Frankovsky, Facebook's VP of hardware design and supply chain, and chairman of the Open Compute Project.

"Cloud computing is the final means by which computing becomes invisible" — John Manley, HP

By breaking infrastructure down into its basic components, replacements and upgrades can be done quickly, he says. The companies best placed to use this form of commoditised infrastructure are large businesses that operate huge datacentres. "I would say that between now and 2020, the fastest-growing sector of the market is going to be cloud service providers," Frankovsky says.

5. Low-power processors and cheaper clouds

We're around a year away from low-power ARM chips with 64-bit capability coming to market. Once that happens, uptake should accelerate: enterprise software will be developed for the RISC chips, allowing companies to run the power-thrifty processors in their datacentres and cut their electricity bills by an order of magnitude.

HP has created a pilot server platform — Redstone — as part of its Project Moonshot scheme to try to get ARM kit to its customers, while Dell has been selling custom ARM-based servers to huge cloud customers via its Data Center Solutions group for years.

By 2020 it's likely that low-power chips will be everywhere. And it won't just be ARM — Intel, aware of the threat, is working hard on driving down the power used by its Atom chips, though most efforts in this area are targeted at mobile devices rather than servers. Facebook thinks ARM adoption is going to start in storage equipment, then broaden to servers.

"I really do think it's going to have a dramatic impact on the amount of useful work, per dollar, you can get done," Frankovsky says. This should help cloud providers, such as Amazon Web Services, cut their electricity bills. Moreover, if they are caught in a price war with competitors, they are more likely to pass on at least a chunk of the savings to developers, in the form of price reductions.

6. Faster interconnects

The twinned needs of massively distributed applications and a rise in the core count of high-end processors will converge to...
