Clouds vs. Appliances

Although cloud computing's evolution toward a kind of connective fog for handhelds and smart phones seems to be proceeding apace, major companies are meanwhile proving unable to keep running smoothly the large-scale, public-access services needed for major business applications to move to the cloud.
Written by Paul Murphy, Contributor

When people talk about cloud computing as the future of corporate IT they're generally thinking more or less in terms of traditional applications coupled with some processor-intensive work in areas like business analytics or numerical analysis. I don't think any of that's going to happen. Instead, I think cloud computing will evolve mainly into services for smart phones, ultimately forming a kind of connective fog: an extension of present internet usage providing the processing needed to accommodate mostly store-and-forward functionality.

If the upside of traditionally conceived cloud computing is exactly that of 70s-style time sharing - namely, that you neither own nor operate the infrastructure and can therefore largely bypass internal IT to get applications working on it - its nemesis is exactly the same too: cheap local computing.

That's what the PC promised, and it's why so many corporate managers took advantage of data processing's eagerness to buy from IBM to order hundreds, and often thousands, of PC/ATs when those first came out. It's also why many became even more cynical about IT when the first machines turned out to be laughably inadequate, and later ones proved so unreliable that the technologies brought in to free users from the data center are now owned and run by the data center.

Appliance computing, a model under which the customer buys a service delivered via a local processor to which local IT does not have root access, had a brief resurgence in the nineties, but neither the machines nor the software then available could handle larger, more complex corporate applications without considerable on-site expertise. At the time, therefore, that solution was largely limited to tasks perceived as simple and peripheral - the kind we now associate with "purely hardware" solutions (like routing) or hosted services (like running Apache servers).

Today, however, those limits are largely gone, and there are no big technical gotchas facing a company that wants to sell its customers things like fully vendor-installed, vendor-monitored ERP/SCP applications running on supplied servers and desktop displays - entirely on the customer's premises and physically in the customer's control.

To get these, the customer would simply sign a usage agreement and provide appropriate space, power, and network access to the vendor. Once installed, users would treat the applications the way they do the telephone: as something that just works and is monitored externally rather than by local data processing people. And, of course, if the vendor went under, the customer would simply take possession of the equipment, invoke contractual rights to passwords and licenses kept in escrow, and have at least as much time to adapt as if traditional IT had brought the applications on board in the traditional way.

The beauty of this approach is that it gives user management willing to accept standardized software direct control of its own IT while reducing both cash costs and business risk - quite the opposite of what this rather inadvertent bottom-line comment (from a March 2009 Financial Times report on a Gmail shutdown) reveals about the essence of cloud computing:

The glitch that led to the first global shutdown of Gmail since August began on Tuesday, during routine maintenance.

There've been, what, five so far this year?
