
Why Flash and Silverlight will save the web

Flash and Silverlight are viewed in some circles as undermining a free and open Internet. The truth, however, is that both technologies are part of the process by which we map out a problem domain sufficiently, laying the groundwork for standardization committees of the future. Deviation from standards, in other words, is an essential component of standardization.
Written by John Carroll, Contributor

Usually, when I riff off another blogger, it's a blogger with whom I strongly disagree. Disagreeing is a heck of a lot easier to do (a principle to which ZDNet Talkbackers may relate) because it gives you more to talk about. I'd rather join a discussion composed mostly of anti-globalization, closed border Luddites than one composed of free trade technophiles, not because I agree with the former group, but because we'd have more to talk about than the latter.

I agree completely, however, with Paul Ellis, and I think I have something to add to his argument. Ellis, in a post titled "A Proprietary Web? Blame the W3C", discussed the issue of whether proprietary companies such as Adobe, with Flash, or Microsoft, with Silverlight, enhance the web or subvert it by popularizing technologies that don't conform to free and open web standards. Ellis' argument was that it's not so much the fault of Adobe and Microsoft for striking off in independent directions as that of the W3C, whose last major standardization effort (XHTML 1.1) was in 2001. Faced with a web that, in the words of Ellis, "has become increasingly stale for modern web development needs," Flash and Silverlight fill a void left in the development architecture most closely associated with the most common communications medium in existence.

I have trouble, however, blaming the W3C for lack of speed, as speed is not supposed to be a criterion applied to the work of standardization committees. Their real job is to assemble the accrued knowledge related to a problem domain, refine it through a distillation process whose aim is to make the technology logically consistent and reusable, and package it for general consumption through a system designed to popularize it.

That work is, by its very nature, slow-moving. Furthermore, it's a process that of necessity lags the technological cutting edge by many years. A standard that tried to predict what technology would do in the future would be like historians writing theories about events that haven't happened yet.

Pushing the boundaries is not the job of a standardization committee. That is the job of private entities striving to figure out how best to apply new technology to existing customers. It is an exploration that runs in parallel, as it works best when multiple companies compete for dominance and market share using cutting-edge technology as the lure which attracts new customers. They make lots of mistakes along the way, but that's the point. You learn a problem domain by trying lots of different ideas, only some of which will prove useful and correct.

Only when a problem domain has been sufficiently explored and understood is it time for standardization committees to get involved and try to codify what has been learned.

It would make no more sense for the market to confine itself only to technology that is backed by a publicly ratified standard than it would for all of us to confine ourselves to 1850s technology in order to avoid the risk of dependence on new (and possibly patented) innovation. I don't pretend companies are following altruistic motives when they create non-standard technology, but then again, I don't expect companies to act altruistically. There is a useful byproduct from their efforts, as they add to our collective knowledge of "the right way to do things" that will, eventually, inform a standards committee's work.
