Paid Content : This paid content was written and produced by RV Studios of Red Ventures' marketing unit in collaboration with the sponsor and is not part of ZDNET's Editorial Content.

The datacentre and ubiquitous storage in 2015 and beyond

Datacentres and cloud solutions are now well-established best practices, yet a mere five years ago this was not the case.

ABI Research found that the installed base of active wireless connected devices exceeded 16 billion in 2014, about 20 percent more than in 2013, and forecast that the number would more than double to 40.9 billion by 2020.

Bain & Company analysts estimated that, globally, the IT industry would spend about $2 trillion on deployment and operations in the years leading up to 2015, unless smarter infrastructure radically simplified the management of virtualised environments.

In fact, research in 2011 indicated that if the technology in use at the time were to continue, 45 new coal power plants would be required across the United States to support IT infrastructure in 2015.

Cost and power efficiencies, along with other drivers such as lock-in avoidance, drove greater adoption of virtualisation technologies and advances in cloud technology.

The Telsyte Australian Infrastructure and Cloud Computing Market Study 2014 forecast the total market value of public cloud infrastructure services to reach AU$650 million by 2018, up from AU$305 million in 2014.

We are now on the other side of these predictions, with virtualisation, cloud solutions, and products like OpenStack becoming mainstream, best-practice approaches.

The next big challenge between now and 2020 is meeting the broader demands of the future cloud. Intel's R&D vision for 2020 identifies the two big demands the cloud will face: "big data" insights -- deriving business, scientific, and social insight from global knowledge -- and distributed computing -- specifically, enabling computing to move wherever it is needed via apps that span cloud, client, and edge.

Intel sees this being achieved by what it dubs ubiquitous storage: taking the evolution of storage from the original monolithic models, through the current scale-up and scale-out model, to the next level.

Storage, despite sometimes being dubbed "snorage" by the press, is a big deal. Open and interoperable solutions are essential to deliver the benefits of secure federated data shared among public and private clouds, to allow IT to focus on innovation rather than management, and to optimise services based on device capability.

Ubiquitous storage, Intel said, will offer greater modularity and flexibility, distributing data between public and private clouds, with automated tiering and scaled access, and supporting multi-tenancy on shared infrastructure.

Ubiquitous storage meets exponential demand, controls rising costs through efficiency and simpler management, increases data durability while providing secure 24/7 access from anywhere, and supports growth and innovation through open standards.

In this future, storage will become a service, allowing developers to store and retrieve any amount of data, at any time, from anywhere on the web.
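From a developer's point of view, that kind of storage service boils down to putting and getting objects by name, without caring where the bytes physically live. A minimal sketch of such an interface, using an in-memory toy model (the `ObjectStore` class and its method names are illustrative, not any specific vendor's API):

```python
class ObjectStore:
    """Toy in-memory model of a storage-as-a-service API:
    arbitrary data is stored and retrieved by bucket and key."""

    def __init__(self):
        self._buckets = {}

    def put(self, bucket: str, key: str, data: bytes) -> None:
        # Create the bucket on first use, then store the object.
        self._buckets.setdefault(bucket, {})[key] = data

    def get(self, bucket: str, key: str) -> bytes:
        # Retrieve the object; raises KeyError if it does not exist.
        return self._buckets[bucket][key]


store = ObjectStore()
store.put("photos", "2015/holiday.jpg", b"\xff\xd8\xff")
print(store.get("photos", "2015/holiday.jpg"))
```

Real services layer authentication, replication, and tiering behind the same simple put/get contract, which is what lets the data sit in a public cloud, a private cloud, or both.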

Of course, to support this, the network must continue to evolve, breaking bottlenecks and reducing latency. This will be coupled with data compression and data deduplication enhancements.
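Deduplication in particular reduces both storage and network load by storing each unique piece of data only once. A simplified sketch of content-addressed, fixed-size-chunk deduplication (the chunk size and function names are illustrative; production systems typically use variable-size chunking):

```python
import hashlib


def dedup_chunks(data: bytes, chunk_size: int = 4):
    """Split data into fixed-size chunks and store each unique
    chunk once, keyed by its SHA-256 digest (content addressing)."""
    store = {}   # digest -> chunk bytes, stored only once
    recipe = []  # ordered list of digests to reassemble the data
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        recipe.append(digest)
    return store, recipe


def reassemble(store, recipe) -> bytes:
    # Rebuild the original data from the stored chunks.
    return b"".join(store[d] for d in recipe)


data = b"ABCDABCDABCDEFGH"  # three repeats of "ABCD", then "EFGH"
store, recipe = dedup_chunks(data)
print(len(recipe), "chunks referenced,", len(store), "stored")
# → 4 chunks referenced, 2 stored
```

Because identical chunks hash to the same digest, repeated data costs only a reference, not another copy, which is the efficiency ubiquitous storage depends on at scale.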
