
On-demand computing: What are the odds?

Sure, utility computing may gain significant momentum in the next five years. But will it be a pervasive model for enterprise computing? IBM is betting on it.
Written by Dan Farber

Welcome to the new era of "on-demand" computing. For those of you who missed the big proclamation, newly appointed IBM Chairman Samuel J. Palmisano said last month that he is betting $10 billion that customers will turn to Big Blue to deliver computing resources the way a power utility doles out electricity.

Palmisano called IBM's move into utility computing a bold step, but also a low-risk initiative. Bold and low risk, however, don't usually go together in the same phrase. When Neil Armstrong stepped onto the moon in 1969, that was a bold step, and a high-risk one.

IBM's move into on-demand computing is a natural evolution of its current initiatives rather than the introduction of a disruptive technology or theme. It's the second generation of the Internet e-business initiative that started in 1996. In this instance, IBM technology favorites--including Web services, open standards, grid computing, and self-healing systems--have been unified and anointed as the company's next major, and marketable, strategic initiative.

Irving Wladawsky-Berger, the vice president of technology and strategy who is leading IBM's on-demand effort, told me that the company will spend not just $10 billion to deliver on-demand computing, but an estimated $10 billion per year over the next several years. That figure includes a significant marketing and sales education budget as well as acquisitions, R&D, build-outs of hosting facilities, and on-demand design centers that let customers test out the new concepts.

On-demand computing is also an extension of the successful bet outgoing Chairman Lou Gerstner made in 1996 to invest in IBM Global Services. From 1992 to 2001, Global Services grew from $7.4 billion to $30 billion in revenue, and today accounts for nearly half of IBM's workforce.

IBM doubled down on the consulting services bet with the recent acquisition of PwC Consulting for $3.5 billion. As a result, IBM has a huge trained army of business and IT consultants ready to march down the integration and utility computing path. Their motto: Leave all the technology to us.

In theory, the utility computing concept is a good idea, and will work best if several competing utilities and brokers are vying for your business. You could pipe in CPU cycles, storage, bandwidth and even applications from a shared pool of resources based on demand and pay only for what you use. You could create a virtual data center by mixing and matching on-premises and remotely hosted servers, storage, and applications.
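To make that pay-only-for-what-you-use idea concrete, here's a rough sketch in Python of how a metered, shared pool might work. The resource names and per-unit rates are mine, invented for illustration; they aren't any vendor's actual pricing or API.

```python
# Hypothetical sketch: capacity is drawn from a shared pool on demand,
# and the bill reflects only what was actually consumed.
# Resource names and per-unit rates are illustrative assumptions.

RATES = {"cpu_hours": 0.12, "storage_gb": 0.05, "bandwidth_gb": 0.08}  # assumed $ per unit


class UtilityPool:
    """A shared pool of compute, storage, and bandwidth, metered per unit."""

    def __init__(self):
        self.usage = {}

    def consume(self, resource, units):
        # Record consumption as it happens; nothing is reserved or bought up front.
        self.usage[resource] = self.usage.get(resource, 0.0) + units

    def monthly_bill(self):
        # Pay only for what was used.
        return sum(RATES[r] * u for r, u in self.usage.items())


pool = UtilityPool()
pool.consume("cpu_hours", 500)     # a burst of batch processing
pool.consume("storage_gb", 2000)   # remotely hosted data
pool.consume("bandwidth_gb", 150)  # traffic between on-premises and hosted systems
print(f"This month's bill: ${pool.monthly_bill():,.2f}")
```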

Think about how much more attention you could give to your business if you weren't dealing with the complexities, vulnerabilities, and vagaries of networks, servers, and applications. Of course, that scenario assumes the IT and business processes supplied to your company by a utility actually lower costs, increase efficiency, and produce more customer loyalty and profit.

And, imagine how happy IBM and its shareholders would be if you left the technology to them. It would be the return of the mainframe glory days, but this time in a distributed computing environment, designed and implemented by IBM's consultants and fashioned out of commodity parts with a liberal dose of IBM products like WebSphere, if possible.

IBM is certainly not alone in understanding the virtues of becoming a computing utility. Sun Microsystems has been talking about utility, or Webtone, computing for years. In the words of Sun CEO Scott McNealy: "You've got all of this legacy stuff that wants to connect to this big Webtone jukebox. And this jukebox is capable of playing on all these devices--any device with a digital or electrical heartbeat. This is what we call Services on Demand."

Hewlett-Packard is also looking to carve out a share of the utility computing market with its Planetary Scale Computing initiative. At first I thought it had something to do with interplanetary communications, but it's similar in concept to what IBM and Sun are promoting. At least IBM has avoided colorful descriptors like Webtone and Planetary Computing, which can scare off even a seasoned IT executive.

Down the road, it will be interesting to see if the utility players can really become interchangeable as suppliers of distributed computing services. It would certainly be unique, and it would go against history, in which "open" often becomes just proprietary enough to lock in a customer or protect a revenue stream. I can easily predict that customers will continue to be caught in the crossfire as vendors look for an edge and try to work around the supposedly inviolate nature of open standards.

On-demand computing underpinnings
The underpinnings of utility computing have been most clearly articulated and packaged by IBM so far. Web services standards provide the interoperability among applications, and grid computing enables distributed computing resources to be shared and managed as a single, virtual computer. Bandwidth provides a conduit for all kinds of data and the profusion of message passing.
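As a rough illustration of that interoperability layer, the sketch below uses Python's standard xml.etree module to wrap a request in a SOAP envelope, the kind of self-describing XML message Web services exchange. The operation name and parameters are hypothetical, standing in for whatever a capacity broker might actually expose.

```python
# Minimal illustration of Web-services plumbing: applications exchange
# self-describing XML messages instead of proprietary formats.
# The operation name and parameters below are hypothetical examples.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"


def build_request(operation, params):
    """Wrap an operation call in a SOAP envelope any compliant endpoint can parse."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, operation)
    for name, value in params.items():
        ET.SubElement(op, name).text = str(value)
    return ET.tostring(envelope, encoding="utf-8")


# e.g., ask a (hypothetical) capacity broker for 500 CPU-hours
print(build_request("RequestCapacity", {"resource": "cpu_hours", "units": 500}).decode())
```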

Autonomic computing plays a key role in dealing with the cost and complexity of managing large, distributed networks. Loosely defined as systems that can manage themselves and respond to a variety of conditions, autonomic computing is already part of IBM's latest marketing push. This week the company talked up the self-management features of its PCs as a market differentiator.
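Here's a toy sketch of what "self-managed" looks like in practice: a control loop that watches a component and repairs it without waiting for an operator. The health probe is simulated and the recovery action is a stand-in; this isn't IBM's autonomic technology, just the shape of the idea.

```python
# Toy self-healing loop: detect a failure and recover without an operator.
# The health probe is simulated and the repair action is a placeholder.
import random
import time


def is_healthy():
    # Simulated probe; a real system might check a heartbeat, a port, or a log.
    return random.random() > 0.3


def recover():
    # Stand-in repair action, e.g. restarting a failed service.
    print("component unhealthy -- restarting")


def autonomic_loop(cycles=5, poll_seconds=1.0):
    """Monitor, detect, repair: the system responds to conditions on its own."""
    for _ in range(cycles):
        if not is_healthy():
            recover()
        time.sleep(poll_seconds)


if __name__ == "__main__":
    autonomic_loop()
```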

IBM's Wladawsky-Berger pointed out that companies are already experimenting with elements of utility computing and selectively outsourcing a portion of business processes and computing. Several vendors already offer utility-based pricing for server and storage capacity. Web services and grid protocols are gaining credibility, even at a time when technology spending is mostly confined to the essentials.

"Evenutally we will get there," Wladawsky-Berger said. "But right now it's an Alice in Wonderland situation and we are running like crazy to keep up with growing complexity and new initiatives like wireless."

The big question is when these utility computing products and services will become a mainstream phenomenon. Is utility computing a nice whiteboard diagram or the real deal?

According to Wladawsky-Berger, "The technology is at a point where we can start to move into an era of on-demand computing. I give it between two and four years to reach a level of maturity."

I would give Wladawsky-Berger good odds that by 2007 on-demand computing will gain significant momentum. But will it be a pervasive model for enterprise computing? That depends on a number of variables, including broad adoption of the core technologies underlying the initiative over the next four years. In addition, IT executives who have built fiefdoms will be challenged to make the cultural shift of handing more of their power and personnel over to a utility provider.

At the end of the day, it boils down to relatively simple calculations. If you aren't going to spend millions up front on IT infrastructure and staff, the utility must offer a clear and advantageous alternative, month to month and year over year. Providing server and storage capacity on demand is the easy part.
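For illustration only, here is that simple calculation with made-up numbers: owning the infrastructure (capital outlay plus staff) against paying a utility's recurring fee. Every figure below is an assumption, there to show the shape of the comparison rather than any real price list.

```python
# Back-of-the-envelope comparison with invented figures: owning infrastructure
# (capital plus staff) versus paying a utility month to month.

def owned_cost(capex, annual_staff, years):
    """Total cost of buying and running your own infrastructure."""
    return capex + annual_staff * years


def utility_cost(monthly_fee, years):
    """Total cost of renting the same capability from a utility."""
    return monthly_fee * 12 * years


YEARS = 3
owned = owned_cost(capex=2_000_000, annual_staff=600_000, years=YEARS)
rented = utility_cost(monthly_fee=90_000, years=YEARS)
print(f"Own and operate over {YEARS} years: ${owned:,.0f}")
print(f"Utility service over {YEARS} years: ${rented:,.0f}")
# The utility only makes sense if its total stays clearly below the owned figure,
# month to month and year over year.
```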

Betting that a utility can manage data centers, supply chains, business intelligence, and other critical business processes effectively and at a significant cost savings is far more risky. If on-demand computing can solve those kinds of problems, it may turn out to be a bold step forward.

What do you think: Is utility computing a whiteboard diagram or the real deal? TalkBack below or e-mail me at dan.farber@cnet.com.
