
Why standardization is necessary

Paul Murphy believes writing to standards is good, while standardization - meaning choosing a particular implementation over all others - is bad. Standards may be very good, but the limits of the standard-creation process often make standardization the only alternative.
Written by John Carroll, Contributor

Paul Murphy opined recently on the relative merits of standards versus standardization. To paraphrase, standards are good because multiple products do things in the same general way, making them interchangeable and thus more direct competitors. Standardization, however, is bad because it favors one vendor or configuration to the exclusion of others, locking consumers into that vendor and limiting competition. From Murphy's blog post:

When an industry develops to standards, what you get is competition, buyer choice, and technical progress. In contrast, when an industry standardizes, what you get is higher costs, reduced competition, and the nickel and diming of customer value.

In truth, both standards and standardization aim for the same goal, which is to enable a consistent base for reuse.  The comparative merits of the two approaches, however, should be considered within the context of why companies and consumers frequently choose to standardize on one implementation. Frankly, standardization is often the only option given the limits of a universal technology standard.

1. Standardization lag: It takes time to understand a problem domain well enough to make a proper standard. Understanding usually requires experimentation, and thus cutting edge technology usually involves lots of entities (mostly corporate) competing with each other to do the best job of applying new technology to customer needs. This competition is essential to properly understanding the problem domain, meaning a proper standard can lag the first appearance of a technology by several years.

Likewise, even with understanding, the creation of a standard takes time, much as 100 chefs collaborating on a single lasagna takes time. Put 100 smart programmers in a room, and odds are they will give you 100 different ways of doing the same thing. A resolution requires debate, analysis, and compromise, and such things take time.

Customers are faced with a choice. They can opt to forgo use of a new technology altogether until a proper standard is developed. That, however, is a catch-22: how is anyone going to know what works if everyone is waiting for a mythical standard that remains stuck in research? The result is often that standardization sets in, creating a "de facto" standard centered on one company's implementation of a new technology.

This is why, in most cases, a new standard enters a market where one company has managed to gain a majority market share in technology covered by the new standard.

2. Standards are incomplete: As noted, it takes years before a problem domain is sufficiently well understood to serve as the base upon which a standard can be built. Technology, however, does not stand still while humans fumble for understanding. This means that even if some aspects of a technology have been codified in a shared standard, newer aspects won't be, creating the same impulse to standardize on a de facto implementation as exists in item 1.

Furthermore, there is simply too much to standardize. Ignoring for the moment the churn created by new technology, imagine trying to standardize every interface in an operating system, every API in a software application, every document format, and every network protocol. How would we even gather enough information to manage that?

Likewise, what if there is disagreement? We already have various competing standards for a given technology, such as next-generation web forms, or even standard window managers for Linux. On such a herculean, all-encompassing standard-setting project, I should expect wild disagreement to be the norm, leading to multiple competing "standards" and creating the same situation we face today with non-standard technology.

3. Implementations vary: Standards are a wonderful thing when they actually work. For instance, take the mini-standard that exists on Windows in the form of COM. If I have a COM interface and others implement it, a host that programs against that interface can mostly use their objects irrespective of the company or individual who made them.
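To make that concrete, here is a minimal sketch of the same idea in plain Java rather than COM, using hypothetical names (SpellChecker and two invented vendor classes): the host codes only against the shared interface, so either vendor's object plugs in without any change to the host.

    // Hypothetical "standard" interface; any vendor may supply an implementation.
    interface SpellChecker {
        boolean isCorrect(String word);
    }

    // Two independent, purely illustrative vendor implementations.
    class AcmeSpellChecker implements SpellChecker {
        public boolean isCorrect(String word) {
            return !word.isEmpty() && Character.isLetter(word.charAt(0));
        }
    }

    class GlobexSpellChecker implements SpellChecker {
        public boolean isCorrect(String word) {
            return !word.isEmpty() && word.chars().allMatch(Character::isLetter);
        }
    }

    public class Host {
        // The host depends only on the interface, never on a vendor class.
        static long countCorrect(SpellChecker checker, String[] words) {
            long correct = 0;
            for (String word : words) {
                if (checker.isCorrect(word)) correct++;
            }
            return correct;
        }

        public static void main(String[] args) {
            String[] words = { "standard", "interface" };
            // Swapping vendors requires no change to countCorrect.
            System.out.println(countCorrect(new AcmeSpellChecker(), words));
            System.out.println(countCorrect(new GlobexSpellChecker(), words));
        }
    }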

I say "mostly," because implementation details can be sufficiently different as to compromise replaceability between standards-compliant components. When I worked at Orange in Switzerland, we ran into an issue where our Java product worked with the Xerces XML parser, and not with the JAXP parser.

In theory, that shouldn't have been a problem. Both implemented the DOM XML standard, and thus the two were theoretically interchangeable. Our guess as to the cause of the problem was that the JAXP parser did something different from a threading standpoint than Xerces did. The net result, however, was that we required everyone to STANDARDIZE on Xerces.
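For readers unfamiliar with what that kind of pinning looks like in practice, the sketch below shows the standard, implementation-agnostic JAXP lookup alongside one common way of forcing it to hand back Xerces, via the javax.xml.parsers.DocumentBuilderFactory system property. It assumes the Xerces-J jar and its usual factory class are on the classpath; our actual fix at Orange may well have differed in detail.

    import java.io.File;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;

    public class ParserChoice {
        public static void main(String[] args) throws Exception {
            // Portable JAXP lookup: whatever parser the runtime resolves as its default.
            DocumentBuilderFactory portable = DocumentBuilderFactory.newInstance();
            System.out.println("Default factory: " + portable.getClass().getName());

            // "Standardizing" on Xerces: force JAXP to return the Xerces factory.
            // Assumes xercesImpl.jar is on the classpath.
            System.setProperty("javax.xml.parsers.DocumentBuilderFactory",
                    "org.apache.xerces.jaxp.DocumentBuilderFactoryImpl");
            DocumentBuilderFactory pinned = DocumentBuilderFactory.newInstance();
            System.out.println("Pinned factory:  " + pinned.getClass().getName());

            // Both factories yield a DOM Document per the standard, but subtle
            // implementation differences (threading, defaults) can still bite.
            DocumentBuilder builder = pinned.newDocumentBuilder();
            Document doc = builder.parse(new File(args[0]));
            System.out.println("Root element: " + doc.getDocumentElement().getNodeName());
        }
    }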

Another example can be found in the Linux world. Linux has a level of consistency made possible by a common code base that exists in all distributions of the operating system. That does not mean, however, that one distribution of Linux is automatically replaceable with another. This is why most of the market has coalesced around RedHat in North America and Suse in Europe. Standardization, in other words, is the means by which consumers create economies of scale where standards provide an insufficient base.

4. Standards don't preclude extensions: The fact that my product adheres to a standard doesn't prevent me from adding extra features that are unique to my product. Often, these extras deal with cutting-edge technology, as I explained in item 1. Other times, however, they are just something the designers thought was a better way to do the same thing. Opinions vary among programmers (a fact I'm sure isn't lost on ZDNet readers, or Talkback participants), which is why standards are rarely a trump card in battles over the "right" way to do something.

It's not just proprietary companies seeking advantage who do this. JBoss has heaps of features unique to its J2EE implementation, and Firefox has extras in its implementation of CSS. This isn't something to condemn. A standard ensures interoperability, but it is not supposed to put a muzzle on creativity and prevent programmers from building a better mousetrap.
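A small, hypothetical Java sketch shows why these extras pull so hard toward standardization: the moment client code calls an extension, it depends on the concrete vendor class rather than the shared interface. The names MessageQueue and FancyQueue below are invented for illustration.

    import java.util.List;

    // The shared "standard" interface every vendor implements.
    interface MessageQueue {
        void send(String message);
    }

    // One vendor's implementation adds a non-standard convenience feature.
    class FancyQueue implements MessageQueue {
        public void send(String message) {
            System.out.println("sent: " + message);
        }

        // Extension beyond the standard; nothing in the standard forbids it.
        public void sendBatch(List<String> messages) {
            messages.forEach(this::send);
        }
    }

    public class Client {
        public static void main(String[] args) {
            MessageQueue queue = new FancyQueue();
            queue.send("hello"); // portable: any compliant vendor would do

            // Using the extension ties the caller to the concrete vendor class,
            // which is exactly the pull toward standardizing on that vendor.
            FancyQueue fancy = (FancyQueue) queue;
            fancy.sendBatch(List.of("a", "b", "c"));
        }
    }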

Programmers often find these innovative features interesting. In order to generate economies of scale around these features, therefore, the decision is often made to standardize.


Standards are an essential part of modern software ecosystems. Without standard HTML, standard TCP/IP or standard HTTP, a global Internet would not exist.

In an ideal world, everything could and would be standardized. We do not live in that ideal world, and thus standards rarely manage to account for more than a small fraction of the computing surface area. Standardization is often the only way to achieve the goals of standards committees, which is why standardization is so common in the computing industry.
