SINGAPORE--Utility computing will stay in the realm of single-task applications that require raw compute power, at least for now, according to a senior official from Sun Microsystems.
Stephane Boisvert, Sun's senior vice president of global client solutions, said utility computing today is still largely restricted to tasks such as seismic analysis in the oil and gas industry, as well as large-scale data crunching in the financial sector. In utility computing, companies subscribe to computing services over the Internet much as they purchase electricity.
"When you get to the more complex ERP (enterprise resource planning) applications that respond to the business needs of an enterprise, it's difficult to create an ERP system based on utility computing to support those needs," Boisvert explained.
For example, he said, customer relationship management (CRM) products from the likes of Salesforce.com--where software is delivered as a utility service--cannot be customized to fit the unique needs of an organization.
"You have to take it as it is. You cannot ask Salesforce to redesign the software to fit your needs," he said.
The reason: delivering customized utility computing services to large numbers of users is a gargantuan effort, as it combines voluminous data, supercomputing power and business intelligence, Boisvert said. Achieving a profitable combination at that scale is difficult, for now, he said.
Moreover, he added, big data centers are needed to deliver high levels of customization in utility computing services. "They also have to be energy-efficient, and in places like California, you are restricted in what you can do because of energy efficiency regulations," he noted.
On the recent outages experienced by Salesforce's customers, Boisvert said these companies will always be susceptible to downtime because service-level agreements (SLAs) are not typically included in their contracts with Salesforce.
According to a January 2006 report by research company Forrester, Salesforce.com has "yet to standardize an SLA that reimburses customers for unplanned downtime".
Boisvert said: "In utility computing at large organizations, discussions with customers [always] revolve around SLAs and downtime penalties."
Charles Giancarlo, chief technology officer of Cisco, said last year that utility computing would be a better fit for small and midsize businesses.
In large companies, the decision to host applications inside or outside the network depends on many different factors, including cost and network efficiency, he added.
In Asia, security concerns and companies' preference to own their assets are common barriers to utility computing.
But Boisvert said companies can always opt to keep their own data in-house, and leave the data crunching to a utility service delivered over encrypted channels.
Sun already offers processing and storage in a pay-as-you-go arrangement of US$1 per CPU per hour, delivered via an Internet connection.
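As a rough illustration of how that pricing model works out in practice, the sketch below estimates the cost of a batch job under the US$1 per CPU per hour rate cited above; the function name and the example job parameters are hypothetical, not part of Sun's offering.

```python
def grid_cost_usd(cpus: int, hours: float, rate_per_cpu_hour: float = 1.0) -> float:
    """Estimate the cost of a pay-as-you-go compute job.

    The default rate reflects the US$1 per CPU per hour figure
    cited for Sun's offering; everything else is illustrative.
    """
    return cpus * hours * rate_per_cpu_hour

# e.g. a seismic-analysis batch run on 500 CPUs for 8 hours:
print(grid_cost_usd(500, 8))  # 4000.0 (US dollars)
```

At that rate, occasional large jobs can be far cheaper than owning equivalent hardware, which is why single-task workloads such as seismic analysis are the natural early fit.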