All change?


Every year there are a number of high-profile IT events which capture the imagination, and I tend to think that, along with CES and CeBIT, the Intel Developer Forum (IDF) is one of them. Clearly it’s not as mainstream as the others, but you sometimes get a very good view of emerging dynamics that could change the industry for years to come.

In this respect, IDF 2011 didn’t disappoint. One of the things I found most interesting was the announcement that the Open Compute Project (OCP) and the Open Data Center Alliance (ODCA) are set to collaborate to accelerate the development of cloud standards. Just to recap briefly: the ODCA today comprises more than 300 companies that have come together to provide a unified customer vision for long-term data centre requirements. They aim to define usage models, influence industry collaboration and help create common standards for cloud technologies.

OCP is a Facebook-led initiative that aims to transform the energy efficiency of the world’s data centres. Essentially, it shares the custom-engineered technology behind Facebook’s first dedicated data centre in Prineville, Oregon. That technology delivered a 38 percent increase in energy efficiency at 24 percent lower cost for Facebook, and the specifications and best practices behind those gains are now available to companies across the industry.

To my mind, the significance of this partnership is that end-users will be telling the vendors what they want. For example, ODCA members will have computing requirements fed to the OCP. Open Compute members will take on board these requirements and distill them into hardware designs which in turn will become part of ODCA usage models.

These hardware usage models will be available to other companies who can use them as they deploy particular kinds of infrastructure to support specific workloads. It’s an interesting approach because as well as having industry-defined usage models for IT infrastructure, we are now moving towards an industry-defined set of hardware requirements to implement these usage models.

In short, we’re going to see the establishment of detailed reference architectures for specific workloads. For example, we should get detailed specs on power consumption and cooling systems relating to the applications being run. Clearly this has implications for hardware manufacturers, because the typical ‘server box’ approach is going to lose value when you have, in effect, bespoke architectures for specific usages.

At one level it makes absolute sense; at another, it challenges established practices. That said, if there is one word that sums up the IT industry, it’s ‘innovation’, and if this approach gains traction I’d expect the hardware manufacturers to continue innovating within this new framework.

Certainly the weight seems to be shifting in favour of end-users, and the OCP is playing a strong role in this. I ask myself: is such a revolution possible, or will we continue to see standard servers and IT building blocks as the way forward for cloud computing?



I'm a multi-year Intel veteran, and currently hold the role of Strategic Marketing Director within EMEA. My time with Intel began with a role supporting all the PC design accounts in the UK - back in the days when the i286 was the latest and greatest processor on the Intel roadmap. Since then, I've moved through various techn...
