Getting enveloped by the potential of Cloud computing

By taking a fundamentally Web-based approach to the development of applications, we shift from bolting Web capabilities onto the silo toward a mode in which data and functionality are native to the Web. How do we change the mindset of today's application developers, so that they stop building 'old' applications in the new world?


The orgy of present unwrapping at the end of last year feels a surprisingly long time ago, so it is with some enthusiasm that I await the arrival of my latest bundle from Amazon. Nestled alongside the new offerings from Seth Godin and Garr Reynolds, I look forward to lifting out Nick Carr's latest much-trailed tome, which the book's own website describes thus:

“The shift [to utility computing] is already remaking the computer industry, bringing new competitors like Google and Salesforce.com to the fore and threatening stalwarts like Microsoft and Dell. But the effects will reach much further. Cheap, utility-supplied computing will ultimately change society as profoundly as cheap electricity did. We can already see the early effects — in the shift of control over media from institutions to individuals, in debates over the value of privacy, in the export of the jobs of knowledge workers, even in the growing concentration of wealth. As information utilities expand, the changes will only broaden, and their pace will only accelerate.”

I obviously haven't read the book yet, but I understand that Carr sees this trend as a worrying one. Stephen Baker's review in BusinessWeek notes, for example, that:

“Carr warns this trend could herald a new, darker phase for the Internet—one where these mega-networks could eventually operate as a fearsome entity that will dominate our lives. He dubs it 'the World Wide Computer'.”


“[Carr] describes a world in which a handful of lucky and brilliant entrepreneurs uses the World Wide Computer to tap humanity's smarts and creativity for free, à la YouTube and Wikipedia, while putting legions of information professionals out of work. If that's not dreary enough, he predicts that companies and governments will be able to harvest data from these networked computers to track our behavior and, ultimately, to control us.”

People probably said something similar as Ferranti, Edison, Tesla, Westinghouse et al championed the distribution of electricity from increasingly large and remote generating facilities. Is the premise itself bad, or is it 'merely' a (potentially) painful transition that we face?

Writing in December's issue of the SemanticReport, I take a slightly different look at a related set of issues. In my article, 'Moving the Internet Inside with Semantic Technologies', I contend that:

“The vast majority of today’s enterprise applications owe their genesis to a period very different from today. Even the most apparently innovative share perhaps unnecessary heritage with their ancestors, preventing them from fully exploiting the potential of an ever-more connected world.

The increased ubiquity of high speed access to the Internet has changed the lives of millions. At the same time, plummeting costs for storage, computing and bandwidth have formed key aspects of the environmental shift that has enabled Web-based companies to entice users with significant free offerings, and to subsequently monetise these in a variety of ways from the ever-popular ‘pro’ account to the universal fall-back of advertising. The Web has become our water cooler, our photo album, our book shop, our encyclopaedia, our travel agent, our road atlas, and also fulfils a host of functions without commonplace offline forms.

Inside the enterprise, the Internet remains at a remove from the applications within which most employees spend their time. Valid concerns around security are certainly a factor here, as are the long lead times required to develop and implement new systems. More serious, though, is an apparent lack of vision. Rather than fundamentally re-engineering with what Tim O’Reilly refers to as ‘the Internet Inside,’ new applications continue to repeat the methods and mindset of the past. The capabilities of the network Cloud beyond the corporate firewall remain woefully underused, their potential unrealised.”

We have, of course, seen significant use of the network to connect the far-flung offices of an organisation, and to enable connectivity amongst collaborators. Banks move data amongst themselves all the time, and some of that traverses the Internet. Libraries around the world hurl Z39.50 queries and responses at one another, and all sorts of other organisations leverage the network as a means to transfer information from one place to another. My point is not that businesses don't use the web. Rather, it's that they fail to embrace its potential. To consider the Internet as a means of point-to-point transmission is, surely, to miss the opportunity presented by a complex and multi-directional network of variously connected nodes. The Web isn't (just) a FedEx for bits and bytes. Nor should it be considered merely a means of physically separating the components of our traditional monolithic applications — applications that, whatever their geographic diffusion, remain logically massive and discrete.

Technologies such as those being espoused for the Semantic Web enable us to take further steps, turning traditional application architectures on their heads and embracing the potential offered by a new generation of services that are conceived and constructed with the network at their heart.

By taking a fundamentally Web-based approach to the development of applications, we shift from bolting Web capabilities onto the silo toward a mode in which data and functionality are native to the Web: a mode in which design decisions are more about modelling the business requirements that limit how data flows from one point to another than about trying to anticipate every place it might be needed and designing those pathways into the software from the outset.

Data needn't reside within a single application, and as the trend toward 'widgets' and 'dashboards' has begun to illustrate, interaction with the application itself is no longer restricted to navigation within its own user interface.
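The idea can be sketched in a few lines: a single piece of Web-addressable data, identified by a URI, consumed by independent 'widgets' rather than locked inside one application's own user interface. This is a minimal illustrative sketch only — the resource URI, field names, and widget functions are my own assumptions, not anything from the article quoted above.

```python
# A minimal sketch of data that is 'native to the Web': one resource,
# identified by a URI, rendered by independent widgets. Neither widget
# 'owns' the data; each simply consumes the same addressable resource.
# The URI and field names below are hypothetical.

RESOURCE = {
    "uri": "http://example.org/sales/2008/q1",  # hypothetical identifier
    "label": "Q1 sales",
    "total": 125000,
    "currency": "GBP",
}

def dashboard_widget(resource):
    """One consumer: a terse dashboard tile."""
    return f"{resource['label']}: {resource['total']} {resource['currency']}"

def report_widget(resource):
    """Another consumer: a fuller report line over the same data."""
    return (f"Resource <{resource['uri']}> reports a total of "
            f"{resource['total']} {resource['currency']}.")

# Two presentations, one Web-addressable source of truth.
print(dashboard_widget(RESOURCE))
print(report_widget(RESOURCE))
```

The point of the sketch is simply that interaction moves to wherever the data's identifier can be dereferenced, rather than being confined to a single application's navigation.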

How do we change the mindset of today's application developers, so that they stop building 'old' applications in the new world?

In beginning to think about this post, my original premise was around 'embracing the cloud [computing]'. However, UK technology journalist Bill Thompson was quick to point out that

“if you embrace a cloud you fall to earth... maybe you need to be enveloped by it...”