Lots of people are touting universal software services: sign a contract, fire up a browser, and login to your service provider for ERP, Office automation, web sales, protein folding simulations, anything and everything by the hour, the minute, the transferred byte or whatever other measure appeals to you.
In the old days we called this time sharing, and it really worked pretty well for most customers most of the time. At that time, however, the most important problem this approach addressed was the high cost of technical infrastructure. In science-based computing, DEC's PDP series was king, but it wasn't until the VAX 750 came along in 1980 that it became possible to solve real engineering problems in a reasonable time on a sub-million-dollar machine.
Back in the '70s, software like IBM's STAIRS data manipulation package needed a multi-million-dollar machine to run and came with enormous knowledge-based barriers to successful internal implementation - but seemed safe and cheap when accessed via a time-sharing contract with an IBM data processing center. Today that same situation exists for something like a salesforce.com tracking application - except now the hardware's cheap but risk, knowledge, and staff cost barriers dominate decision making.
Basically, what happened to seventies time sharing was that hardware costs came down faster than people costs while project risks went up - thereby wiping out most time-sharing by the mid to late eighties. Since then, however, people costs and control-related risk perceptions have filled the gap left by decreasing hardware and software costs, meaning that the combined barriers to internal implementation have risen to the point that time sharing is marketable again.
From a business perspective, however, the problem has always been that time-sharing mission-critical services means you're using the same services, at the same cost, as your competitor. In other words, it makes sense to do this if the competition is ahead of you, but not if you're the industry leader, because there's no competitive advantage to be had.
Bottom line: the whole software-on-demand gig will last only until the perceived costs and risks of internal implementation come down for whatever the software is - and that's almost certainly going to happen across the board as open source and co-operative development ideas move into the business applications space.
There's a comparable argument, and result, on the consumer side. I was talking to a Sun guy the other day who held out the idea that consumers would eventually think nothing of passing big computing tasks to the Sun grid. Joe Average, for example, might prefer paying Sun a few dollars to convert an hour-long high-density video tape to stills rather than tie up his PC for weeks on the job.
Cool, except it isn't going to happen: first because the bandwidth isn't there, and more importantly because that nice Sony camera Joe uses will contain a Cell processor - and when it discovers that Joe doesn't have a Cell PC, it will feed whatever he does have stills as fast as its limited interface can take them.