If Sun builds it, will you come?

Dan Farber: Crossing the chasm to utility computing requires an acknowledgement that lots of customization isn't needed, trust in the service provider, and trust that the pricing really is transparent.
Written by Dan Farber
COMMENTARY -- For years, it has mostly been vendors (not their customers) talking about utility computing. You flip a switch and you have all the compute, storage, and network capacity to power your applications over the Web. The analogy is the power grid--all the complexity is hidden. You simply put in your request to run a specific application at a particular service level (at a fair market price, of course) or ask for "X" amount of infrastructure and stop worrying.

IBM, HP and Sun have been pushing the infrastructure-utility concept--with very limited success--for computationally intensive applications in the financial services, petroleum, life sciences and digital animation industries. They all preach about tapping into massive, resilient server farms, responding to peak demands, and paying as you go with metered pricing that lowers operational costs. For example, why run your own server and storage farm for rendering digital animations or unfolding proteins if it's not your core competency and the machines aren't fully utilized?

We know why most enterprises are somewhat reluctant -- they are accustomed to managing their own infrastructure (and IT staff); they don't want their vital data exposed outside their firewall; they make do with their old ways of handling peak demand or compute-intensive applications; and they own literally tons of underutilized hardware.

Sun is making a big bet that enterprises will shift more toward a variable-cost model and metered pricing as a way to lower costs and to get out of the business of building increasingly complicated data centers. The emerging Web services infrastructure is good enough that most enterprises don't need to reinvent the IT wheel. It's the latest incarnation of Sun's tagline: "The network is the computer."

As evidence of its bet on utility computing, Sun is making a huge investment, according to Aisling MacRunnels, vice president of utility computing at Sun, to build what are called Sun Grid Centers in New Jersey, Texas, West Virginia, Toronto, London and Scotland, each starting out with about 5,000 CPUs (Opteron and SPARC running Solaris). Sun also has 15,000 CPUs internally that could be utilized if needed. Grid technology is used to create a pool of virtualized resources, such as CPU cycles, that can be scheduled and allocated to jobs on the fly.
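
To make that allocation model concrete, here's a minimal, illustrative sketch of a shared CPU pool handing out and reclaiming slots as jobs come and go. The names (CpuPool, Job) and structure are mine for illustration, not Sun's grid software:

```python
# A toy model of a virtualized CPU pool, in the spirit of grid scheduling.
# Everything here is hypothetical; it is not a Sun Grid API.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    cpus_needed: int

class CpuPool:
    def __init__(self, total_cpus: int):
        self.free = total_cpus            # unallocated CPU slots
        self.running = {}                 # job name -> CPUs held

    def schedule(self, job: Job) -> bool:
        """Allocate CPUs on the fly if capacity allows; else decline."""
        if job.cpus_needed > self.free:
            return False                  # not enough capacity right now
        self.free -= job.cpus_needed
        self.running[job.name] = job.cpus_needed
        return True

    def release(self, job_name: str) -> None:
        """Return a finished job's CPUs to the shared pool."""
        self.free += self.running.pop(job_name)

pool = CpuPool(total_cpus=5_000)          # one grid center's starting size
pool.schedule(Job("risk-sim", cpus_needed=1_024))
print(pool.free)                          # 3976 CPUs left for other tenants
```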

One key to the economics of any global grid is standardized infrastructure, which makes it easier to build and manage; another is aggregated demand, with multiple "tenants" sharing infrastructure, each in their own secure "container." Yet another, according to Sun President Jonathan Schwartz, is "transparent" pricing, which means that you can truly compare prices--like the price of gas or a loaf of bread. The Sun Grid Compute utility is priced at $1 per CPU per hour, with a minimum four-hour contract. Customers don't have to pay for loading data, for storage, or for scrubbing the systems clean at the end of a job, although they can choose to keep their data in Sun's new Storage Grid utility, which costs $1 per gigabyte per month for the high-end offering.
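
Those published rates make the arithmetic easy to check. A back-of-the-envelope sketch follows; the function is my own, and I'm assuming the four-hour minimum acts as a floor on billed hours:

```python
# Illustrative job costing under Sun's published rates: $1 per CPU-hour
# (assuming the four-hour minimum is a floor on billed hours) plus an
# optional $1 per GB-month on the Storage Grid. Not a Sun API.
def sun_grid_cost(cpus: int, hours: float,
                  storage_gb: float = 0.0, months: float = 0.0) -> float:
    billed_hours = max(hours, 4)              # four-hour minimum contract
    compute = cpus * billed_hours * 1.00      # $1 per CPU-hour
    storage = storage_gb * months * 1.00      # $1 per GB-month
    return compute + storage

# A 1,000-CPU job that runs 12 hours costs $12,000 in compute.
print(sun_grid_cost(cpus=1_000, hours=12))    # 12000.0
```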

MacRunnels said that some financial institutions want to harness 5,000 CPUs 24 hours per day for three years to run compute-intensive applications such as Monte Carlo simulations for risk analysis. A handful of financial companies have booked most of the near-term capacity in the new grid centers, MacRunnels said. "They don't want to build [the infrastructure], and it's not their core competency. It's a huge business. By our calculation and [research firm] IDC's numbers, Wall Street will use about 10 billion CPU hours in 2006." Sun's logic is that if Wall Street adopts the utility concept, Main Street will follow.
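
Monte Carlo risk runs suit a grid because every scenario is an independent draw, so 5,000 CPUs can each take a slice of the trials with essentially no coordination. Here's a toy value-at-risk estimate, purely illustrative; the return distribution and portfolio size are invented, not any bank's model:

```python
# A toy Monte Carlo value-at-risk estimate. The return distribution and
# $1M portfolio are invented for illustration; real risk models differ.
import random

def simulate_loss() -> float:
    # One scenario: daily portfolio return drawn from N(0.05%, 2%).
    return -random.gauss(0.0005, 0.02) * 1_000_000

def value_at_risk(trials: int, confidence: float = 0.99) -> float:
    losses = sorted(simulate_loss() for _ in range(trials))
    return losses[int(confidence * trials)]   # loss exceeded ~1% of the time

# Each trial is independent, so N CPUs can each run trials/N of them.
print(f"99% one-day VaR: ${value_at_risk(100_000):,.0f}")
```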

Sun also plans to revive its "retail" grid, similar to the Compute Grid, in the next few months. It will give customers a more just-in-time, self-service approach: they can obtain compute or storage services over the Web much as they would order music on Amazon.

In addition, Sun is creating a Developer Grid utility, which would allow developers to build applications in the Sun environment and then test and deploy them. In early summer, Sun will introduce an Application Grid for running internal applications, such as oil-exploration workloads in the petroleum industry, as well as popular enterprise software--if it can recruit the vendors to adapt their software for the grid. Further out, MacRunnels envisions a global grid for running desktop systems, but transactional grids are a more challenging proposition.

If utility computing is going to be a huge business sooner rather than later, Sun will feel the heat from other big infrastructure providers. Schwartz has said that Sun's most effective weapon against IBM in utility computing is transparent pricing. If that's the case, then at first glance Sun's transparent pricing doesn't seem competitive with IBM's. IBM is offering its computing grid for about half the price, roughly 47 cents per CPU per hour. IBM has three grid centers: in Poughkeepsie, New York; Houston; and Montpellier, France. Poughkeepsie has about 4,000 CPUs and Houston has 1,024, primarily Xeon and Opteron clusters running Linux. The Montpellier facility has mostly pSeries servers running AIX, according to David Gelardi, vice president of Deep Computing Capacity on Demand at IBM. He said IBM has about 20 clients, and his unit typically sells several days to a few weeks of work, using anywhere from 128 to 1,024 CPUs. Landmark Graphics, for example, taps into the grid to analyze seismic oil-exploration data faster.

"If you look at either Poughkeepsie or Houston, given all of costs, we can sell profitably at about half of what Sun charges," said Gelardi. "Our maximum utilization would be 85- to 90 percent, but our targets are at the 50- to 75-percent utilization level. At that point, we can turn a profit." To date, Deep Capacity isn't a substantial business for IBM--it accounts for less than $10 million of the company's yearly $96.5 billion in revenue.

Either Sun's $1 per CPU per hour has a massive profit margin built in, or IBM's pricing is more of a transparent-pricing attack on Sun's effort to take a leadership role in bringing utility computing into the mainstream than a service with a real profit margin. According to MacRunnels, Sun has a pricing advantage over competitors like IBM and HP because it owns the complete stack (no licensing fees). Sun's grid centers will also stay relatively fresh, with Sun replacing AMD-based systems every six to eight months and using the older systems for other functions or possibly selling them on the gray market.

MacRunnels said that to cross the chasm to utility computing, you have to get people to move to a paradigm of trust--trusting that they don't need lots of customization, that the service provider is trustworthy and that pricing is really transparent. Based on the early forays into the marketing of utility computing, transparent pricing needs to be more "transparent." Differentiation isn't based on whether the processor is Opteron or Xeon or Power. It's about the service experience and quality rendered, and being able to compare apples to apples. Without a clear view into how the different "utilities" arrive at their pricing, you'll be scratching your head trying to find a semblance of truth.

You can write to me at dan.farber@cnet.com. If you're looking for my commentaries on other IT topics, check out my blog Between the Lines.
