Why standardization is necessary

Summary: Paul Murphy believes writing to standards is good, while standardization - meaning choosing a particular implementation over all others - is bad. Standards may be very good, but the limits of the standards-creation process often make standardization the only practical option.


Paul Murphy opined recently on the relative merits of standards versus standardization. To paraphrase, standards are good because multiple products do things in the same general way, making them interchangeable and therefore more direct competitors. Standardization, however, is bad, because it favors one vendor or configuration to the exclusion of others, locking consumers into that vendor and limiting competition. From Murphy's blog post:

When an industry develops to standards, what you get is competition, buyer choice, and technical progress. In contrast, when an industry standardizes, what you get is higher costs, reduced competition, and the nickel and diming of customer value.

In truth, both standards and standardization aim for the same goal, which is to enable a consistent base for reuse. The comparative merits of the two approaches, however, should be considered within the context of why companies and consumers frequently choose to standardize on one implementation. Frankly, standardization is often the only option given the limits of universal technology standards.

1. Standardization lag: It takes time to understand a problem domain well enough to make a proper standard. Understanding usually requires experimentation, and thus cutting edge technology usually involves lots of entities (mostly corporate) competing with each other to do the best job of applying new technology to customer needs. This competition is essential to properly understanding the problem domain, meaning a proper standard can lag the first appearance of a technology by several years.

Likewise, even with understanding, the creation of a standard takes time much as 100 chefs working together on a lasagna takes time. Put 100 smart programmers in a room, and odds are they will give you 100 different ways of doing the same thing. A resolution requires debate, analysis, and compromise, and such things take time.

Customers are faced with a choice. They can opt to forego use of a new technology altogether until a proper standard is developed. That, however, is a catch-22: how is anyone going to know what works if everyone waits for a mythical standard that cannot be written until the technology has been tried in practice? The result is often that standardization sets in, creating a "de facto" standard centered on one company's implementation of a new technology.

This is why, in most cases, a new standard enters a market where one company has already managed to gain a majority share of the technology covered by the new standard.

2. Standards are incomplete: As noted, it takes years before a problem domain is sufficiently well understood to serve as the base upon which a standard can be built. Technology, however, does not stand still while humans fumble for understanding. This means that even if some aspect of a technology has been codified in a shared standard, newer aspects won't be, creating the same impulse to standardize on a de facto implementation as exists in item 1.

Furthermore, there is simply too much to standardize. Ignoring for the moment the churn created by new technology, imagine trying to standardize every interface in an operating system, every API in a software application, every document format, and every network protocol. How would we even gather enough information to manage that?

Likewise, what if there is disagreement? We already have multiple competing standards for a given technology, such as next-generation web forms, or even standard window managers for Linux. On such a herculean, all-encompassing standard-setting project, I should expect wild disagreement to be the norm, leading to multiple competing "standards" and creating the same situation we face today with non-standard technology.

3. Implementations vary: Standards are a wonderful thing when they actually work. For instance, take the mini-standard that exists on Windows in the form of COM. If I have a COM interface, and others implement it, a host application which uses that interface can mostly use those objects irrespective of the company or individual who made them.

I say "mostly," because implementation details can be sufficiently different as to compromise replaceability between standards-compliant components. When I worked at Orange in Switzerland, we ran into an issue where our Java product worked with the Xerces XML parser, and not with the JAXP parser.

In theory, that shouldn't have been a problem. Both implemented the DOM XML standard, and thus the two were theoretically interchangeable. Our guess as to the cause of the problem was that the JAXP parser did something different from a threading standpoint than Xerces did. The net result, however, was that we required everyone to STANDARDIZE on Xerces.
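
For readers who want to see what that substitution looks like, here is a minimal Java sketch (illustrative only: the class and method names are mine, but the library calls are the standard JAXP entry point and the Xerces-specific one):

    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;

    public class ParserChoiceSketch {

        // Parse through the standard JAXP entry point. Which parser does the
        // work is an implementation detail that can change without touching
        // this code.
        static Document parseViaJaxp(String uri) throws Exception {
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            DocumentBuilder builder = factory.newDocumentBuilder();
            return builder.parse(uri);
        }

        // Parse by instantiating Xerces directly, tying the application to
        // that one vendor's implementation.
        static Document parseViaXerces(String uri) throws Exception {
            org.apache.xerces.parsers.DOMParser parser =
                new org.apache.xerces.parsers.DOMParser();
            parser.parse(uri);
            return parser.getDocument();
        }
    }

Both paths hand back an org.w3c.dom.Document, so code written against the DOM interfaces should not care which one produced it. Our experience was that, in practice, it sometimes does.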

Another example can be found in the Linux world. Linux has a level of consistency made possible by a common code base that exists in all distributions of the operating system. That does not mean, however, that one distribution of Linux is automatically replaceable with another. This is why most of the market has coalesced around Red Hat in North America and SUSE in Europe. Standardization, in other words, is the means by which consumers create economies of scale where standards provide an insufficient base.

4. Standards don't preclude extensions: The fact that my product adheres to a standard doesn't prevent me from adding extra features that are unique to my product. Often, these extras deal with cutting-edge technology, as I explained in item 1. Other times, however, they are just something the designers thought was a better way to do the same thing. Opinions vary among programmers (a fact I'm sure isn't lost on ZDNet readers or Talkback participants), which is why standards are rarely a trump card in battles over the "right" way to do something.

It's not just proprietary companies seeking advantage who do this. JBoss has heaps of features unique to its J2EE implementation, and Firefox has extras in its implementation of CSS. This isn't something to condemn. A standard ensures interoperability, but it is not supposed to put a muzzle on creativity and prevent programmers from building a better mousetrap.

Programmers often find these innovative features interesting. In order to generate economies of scale around these features, therefore, the decision is often made to standardize.
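
To make the mechanism concrete, here is a small Java sketch with purely hypothetical names: the vendor's class implements the standard interface faithfully, but every call site that uses the extra, non-standard method quietly deepens the dependency on that one vendor.

    // Hypothetical names, for illustration only.
    interface StandardCache {                      // what the standard defines
        void put(String key, byte[] value);
        byte[] get(String key);
    }

    class VendorCache implements StandardCache {   // one vendor's implementation
        private final java.util.Map<String, byte[]> store =
            new java.util.HashMap<String, byte[]>();

        public void put(String key, byte[] value) { store.put(key, value); }
        public byte[] get(String key)             { return store.get(key); }

        // Non-standard extension: convenient, but any code that calls it
        // works only against this vendor's product.
        public void putWithExpiry(String key, byte[] value, long ttlMillis) {
            put(key, value); // expiry bookkeeping omitted in this sketch
        }
    }

Once enough of a shop's code calls putWithExpiry, swapping in a different standards-compliant cache is no longer a drop-in exercise, which is exactly the pull toward standardizing on that vendor.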


Standards are an essential part of modern software ecosystems. Without standard HTML, standard TCP/IP or standard HTTP, a global Internet would not exist.

In an ideal world, everything could and would be standardized. We do not live in that ideal world, and thus standards rarely manage to account for more than a small fraction of the computing surface area. Standardization is often the only way to achieve the goals of standards committees, which is why standardization is so common in the computing industry.


John Carroll

About John Carroll

John Carroll has delivered his opinion on ZDNet since the last millennium. He has not been a Microsoft employee since May 2008, and he currently works at a unified messaging-related startup.


Talkback

  • Standards first

    A few items. While providing functionality above and beyond a standard is nice, the main complaint is that many people are picking and choosing what parts of a standard to implement instead of tackling the entire standard before implementing extras.

    Your Xerces versus JAXP example is a bit strange. First off, it is not the DOM standard that is to be blamed for this. The DOM standard is what allows different programs (in this case XML parsers) to interface with XML documents. This doesn't control all the methods that Xerces and JAXP happen to expose. Either they aren't really standards compliant or the function that you were using wasn't in the standard.

    The main point of standards versus standardization is competition/replaceability. If everyone is working toward the standard then more tool-kits (dare I say "ecosystem") will be available for use. Instead, if you standardize on one tool-kit's implementation then you're stuck.
    Robert Crocker
    • Re: standards

      [i]While providing functionality above and beyond a standard is nice, the main complaint is that many people are picking and choosing what parts of a standard to implement instead of tackling the entire standard before implementing extras.[/i]

      But as I said, that's partly because a standard usually arrives long after a de facto standard has been created because the industry doesn't feel like waiting for the standard process to complete. So, what happens is the new standard gets reverse engineered onto the existing technology.

      If a standard is sufficiently compelling, however (and CSS is, as was TCP/IP), the market usually forces vendors to implement it.

      [i]First off, it is not the DOM standard that is to be blamed for this. The DOM standard is what allows different programs (in this case XML parsers) to interface with XML documents. This doesn't control all the methods that Xerces and JAXP happen to expose. Either they aren't really standards compliant or the function that you were using wasn't in the standard.[/i]

      Maybe I wasn't completely clear on this part. I'm not blaming the standard. The standard did just what it claimed to do, which was create a standard interface protocol for speaking to XML parsers. Rather, the problem was in the IMPLEMENTATION of that standard, and we were NOT using any methods not part of the DOM standard. Like I said, we thought it had something to do with different threading approaches. I can't remember why we thought that, but it had something to do with the exceptions we were getting and the nature of the blockage.

      A threading approach is not often covered by the details of a standard, which is why I noted that implementation differences can be a goad towards standardizing on a particular implementation.
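
      To illustrate the kind of gap I mean (a hypothetical sketch, not our actual code): whether a parser instance may safely be shared across threads is essentially an implementation detail, so something like the following compiles and runs against any DOM/JAXP implementation, but whether it behaves depends on which one is on the classpath.

          import java.io.ByteArrayInputStream;
          import javax.xml.parsers.DocumentBuilder;
          import javax.xml.parsers.DocumentBuilderFactory;

          public class SharedParserSketch {
              public static void main(String[] args) throws Exception {
                  // One builder instance shared by two threads. Thread safety here
                  // is an implementation detail, not something the standard guarantees.
                  final DocumentBuilder shared =
                      DocumentBuilderFactory.newInstance().newDocumentBuilder();

                  Runnable work = new Runnable() {
                      public void run() {
                          try {
                              shared.parse(new ByteArrayInputStream("<doc/>".getBytes()));
                          } catch (Exception e) {
                              // One implementation may land here; another may happen not to.
                              e.printStackTrace();
                          }
                      }
                  };
                  new Thread(work).start();
                  new Thread(work).start();
              }
          }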

      [i]The main point of standards versus standardization is competition/replaceability. If everyone is working toward the standard then more tool-kits (dare I say "ecosystem") will be available for use. Instead, if you standardize on one tool-kit's implementation then you're stuck.[/i]

      Yes, and like I said, that's almost impossible in practice, due to standardization lag, standard incompleteness, implementation variance, and the presence of "extras." Standards rarely cover more than a small percentage of the computing space, and frankly, that's all they will ever manage.
      John Carroll
      • Define Standardization...

        If by that you mean the process of making into a standard or incorporating into a standard a "technology" (whatever) that has been validated by prior use, then I agree.
        All other uses of that term are in my opinion devoid of meaning (a standardisation process does imply that somewhere down the line you do get a standard do you think? I really can't see how you or Murphy can oppose these two notions)

        This process is precisely how most standards committees work... Consider the C++ ISO group: they actually encourage vendors to provide extensions to the standard in their implementations. If an extension is adopted by a significant number of developers it will be reviewed by the committee (compared with competing and conflicting ones) and eventually added to standard C++. Of course, if you implement extensions without taking the pains to be reasonably standards compliant first (like Microsoft used to do with its C++ compilers) you are going to be criticised.

        What's the benefit of having a standard if it is only there to validate independent innovations, then? It is a strong unifying force, of course, not only when a new version is decided upon but all along the discussion process (for instance in the ISO case, with TRs). It encourages implementers to discuss different solutions among themselves so as to convince each other. Generally it ensures that the best solution actually has a shot, not just the first one hitting the public, or the one with the biggest vendor backing it. And it gives innovations vastly more visibility and usefulness to developers by assuring there will be a push to implement them over all platforms.
        Look at what it has done for MS VC++. I don't think many people will argue that VC 8.0 is a serious progress over its predecessors, as were 7.1 and 7.0 before it, and the main trend there is definitely better standards compliance. It seems that your employer has finally gotten it, by the way (at least part of that gigantic body that is Microsoft); hiring Sutter in the first place was probably the first sign of a change in direction.

        Of course there are cases where such processes don't work, because the main actors are dumb. JavaScript is such a case. It was going nowhere very fast under the "joint" stewardship of MS and NS. As things are now it is barely usable, thanks to W3C efforts and Netscape's late turnaround. And last I looked, IE's support for CSS 2.0 was nothing to brag about (far, far inferior to Mozilla's), despite being a standard that MS originally pushed.
        This kind of story is why standards must, by definition, be put in the hands of "neutral" (or rather balanced) committees, else they cease to exist. When you have more than one implementation, discussion and arbitration become necessary. This doesn't stifle innovation; on the contrary, it ensures that it will be possible to leverage it.

        Had a standard Java been made, instead of the vendor-proprietary language we have now, it would have been made a first-class citizen of the CLI platform, as C++ is about to be. You wouldn't have thought that a bad thing, would you?

        As for your story with the XML parser, I profess total incompetence, but I can see how it constitutes an argument for thorough standards. Your problems seem to have come from the fact that the standard ignored multithreaded environments, as does the C++ standard, by the way. I am not sure that, strictly speaking, this directly concerns a DOM standard; as already mentioned, it has more to do with the API, no? But there is anyway a specific problem there: depending on your definition of it, thread safety can invalidate certain technical designs (such as copy-on-write patterns) (the other way to look at it is to ask the question "when do I need to lock?"). As such deciding on a design that gives the user maximum flexibility is difficult.

        But how can you describe what you did as "standardization"? That would require, at the very least, that you could describe your choice by something other than the name of the vendor, that you could describe the technical option you decided upon, so that another complying implementation could be made besides that of the vendor you originally favoured. Otherwise I fail to see where there is anything "standard" there. It's just you picking a library.
        pphant
        • Edit...

          'As such deciding on a design that gives the user maximum flexibility is difficult.'
          rather:
          'As such deciding on a design that gives the implementer maximum flexibility is difficult.'
          pphant
        • Err... Edit 2

          should read:
          somewhere down the line you do get a standard _don't_ you think?

          I don't think many people will argue than VC 8.0 _isn't_ a serious progress

          Sorry, writing long messages in English is a demanding exercise for me... I lost focus.
          pphant
  • Standards. Some, way too soon.

    Great article. I think many of the standards are absolutely necessary: HTML, TCP/IP, SMTP, CSS, XML, ECMAScript, and HTTP, to name a few.

    I run into a problem with those who claim that extensions for products should not be made or that there has to be a standard for everything that has anything to do with the web or any kind of internet communication.

    The fact is that in many areas, it is too soon to develop hard and fast standards, and often it is not clear whether the standard is being developed for the good of the web or the good of the companies pushing for the standard. (I think ODF is a perfect case in point)

    I've also heard some claim that products like Flash should not be on the market because they clash with the W3C SVG standard. I love the democratic nature of the web, but this smacks of imperialism.

    If it is the case that Flash should not be on the market, then maybe it is the case that it was too soon for the W3C to be working on a graphics standard?
    bob2cam
    • ODF

      [i]The fact is that in many areas, it is too soon to develop hard and fast standards and often it is not clear if the standard is being developed for the good of the web or the good of the companies pushing for the standard. (I think ODF is perfect case in point)[/i]

      I'm curious what your problem is with Boeing or the Society for Biblical Literature pushing ODF.

      [b]Of course[/b] they're doing it for their own good. That's how it works in the grown-up world: people cooperate and compromise to get results that are on balance better for everyone.
      Yagotta B. Kidding
  • Compliance

    Of course, the best thing about "standardization" is that the chosen vendor can never be out of compliance. No matter how much the interfaces thrash between releases, the costs are all pushed off on the downstream parties.

    That's why "standardized" products are never well-documented. If they were well-documented, not only would the documentation move the market towards a standard (the specification) but the controlling vendor would lose control to their own child: the customers would start requiring compliance with the "standard."
    Yagotta B. Kidding
    • I don't agree

      [i]That's why "standardized" products are never well-documented. If they were well-documented, not only would the documentation move the market towards a standard (the specification) but the controlling vendor would lose control to their own child: the customers would start requiring compliance with the "standard."[/i]

      Not true. Take Microsoft.

      If Microsoft didn't document its "extensions" to HTML / CSS, developers couldn't use them. Those things are extensively documented.

      The barrier to implementing them is more philosophical than technical.

      [i]Of course, the best thing about "standardization" is that the chosen vendor can never be out of compliance. No matter how much the interfaces thrash between releases, the costs are all pushed off on the downstream parties.[/i]

      You exaggerate the thrash. They can change to their heart's content the "non-public" stuff that third parties aren't basing software upon. Microsoft (again, as a good example), however, can ill afford API thrash between releases, because that would hurt the millions of people who rely on the old API format.

      You can accuse Microsoft of many things, but one thing you generally can't accuse them of is abandoning old APIs. Witness the lifespan of the Win9x core and support for the kinds of applications that assumed the presence of the idiosyncrasies of that environment. Likewise, if you use COM, you will find many interfaces with the same name but a 2, 3, etc. tacked on the end. Why? Because Microsoft wants to maintain the old interface, because third-party developers rely on it.

      Microsoft ADDS to the API set between versions, but they rarely break the old stuff. Breaking old stuff is more the approach taken by Apple.
      John Carroll
      • Don't bring up APIs

        M$ always has 2 sets of APIs - optimized ones for internal use only (i.e. secret), and slow-ass crap APIs for everyone else. Whenever someone is complaining about #1, M$ ALWAYS answers by using #2. "What secret APIs (#1)? They are ALL published (#2)". Smacks of the Big Lie - just keep telling it, people will eventually believe it.
        Roger Ramjet
        • Has anyone proved that?

          I have no doubt there are undocumented APIs within Windows, but a full 2 sets? Or even enough of a secondary set to put anyone at an overall disadvantage?
          I can't remember who said it, but to paraphrase
          'It is possible for three people to keep a secret, assuming 2 of them are dead'
          MS has lots and lots of people working for it. Were this the case, everyone would know.
          mdemuth
          • Benjamin Franklin

            [i]I can't remember who said it, but to paraphrase
            'It is possible for three people to keep a secret, assuming 2 of them are dead'[/i]

            Good old Ben.

            [i]I have no doubt there are undocumented API's within Windows, but a full 2 sets? Or even enough of a secondary set to put anyone at an overall disadvantage?[/i]

            At least back in the early 90s, certainly. "Undocumented Windows" tracks down quite a few, but they mutate with every revision -- that makes it extremely hazardous for anyone but Microsoft to use them.

            As for advantage, one example was with the early MSwin version of WordPerfect. It got slammed in reviews for being painfully slow to load and save documents compared to MSWord. It later turned out that MSWord used an undocumented file-access call that was blazingly faster than the official one.

            WordPerfect eventually reverse-engineered what MS was doing and the performance of their next version was on par with MS, but of course each time around the block cost them market share.

            Much of this was documented in Congressional and FTC hearings.
            Yagotta B. Kidding
      • Breakage, YEAH!!!!

        "Microsoft ADDS to the API set between versions, but they rarely break the old stuff. Breaking old stuff is more the approach taken by Apple."

        This is why the almost unfraggable HPFS that was included in Windows OS/2 NT was replaced by the highly fraggable NTFS. That this made Microsoft NT applications unusable on IBM's OS/2 of course had nothing to do with it.
        Update victim
        • What on Earth...

          ...are you trying to prove by comparing things as unrelated as that? Carroll was mentioning APIs, not file systems.
          Anti_Zealot
  • Thankfully there are many standards to swim in

    From a developer's perspective, standards and standardization can be a help or a hindrance. I've told the story before about my Unix programming experience (developing to POSIX in C). In short, it was nice for a while, but I grew to dislike it, because it felt like a straitjacket. Eventually I hit a wall on what it could do for me. There was a sense of "thus far, and no farther." Not that I couldn't implement what I needed within POSIX, but its functionality never expanded to include functions that would've made my job easier. The job I was working on required that I stick closely to POSIX, because we needed our software to be portable across multiple Unix platforms. At the same time, our clients had standardized on ANSI C. While some of my friends were using C++ on Unix where they worked, I couldn't, because our customers either didn't have C++ installed on their systems, or there was the fear that different C++ compilers lacked ANSI standard features. At least ANSI C was implemented the same everywhere.

    Even so, I ran into differences that had an impact on our ability to port our software to different Unixes. We always managed to do it, but whenever I deployed our software to a new system, debugging the system library differences was always something I had to do.

    The end result was there came a point where I was not progressing in my technical knowledge. I had learned everything I needed to develop the product and deploy it. But POSIX wasn't expanding anytime soon, and C++ compilers were not going to be 100% compliant with the ANSI standard anytime soon either. So I was hemmed in by both standards AND standardization. These in and of themselves are not bad. The problem was the standards and the standardized platforms I was dealing with were progressing at a snail's pace.

    Maybe the problem actually was the customer base my employer was dealing with, and I just needed a different employer with a different base.

    By the time I quit that job and started looking for work in 2000, almost every place I looked at for a new job was using C++, and Java was starting to become a factor as well. Because I had been stuck just doing C/POSIX work at my former employer, I had no idea that the rest of the industry had gone beyond that, and I was behind the times. My existing language/API skill set was only nominally useful to anyone else. So I played catch-up and eventually found work again, but now I work on a de facto standard platform that is continually growing, so there is no upper limit that I've run into with the knowledge that I can obtain and still do something useful with it. I'm much happier in this environment.
    Mark Miller
    • Re: No standard remains a standard

      I have found the same problem with various programming languages; it doesn't matter what hardware, OS or platform, they all have limitations. Sooner or later somebody comes up with something new that is lacking in areas where previous programming languages had no problems.
      Java was meant to be: Write on any platform and run on all.

      I have worked on: Ancient flavours of BASIC, VB6, ASP.NET, C#, C++ (DOS, Windows and Linux), Turbo Pascal, Delphi, Java, COBOL, DBase,

      Sometimes I still find some old 16-bit apps made for Windows 3.1 which can still do what they were supposed to do, but many have issues due to the major changes done in Windows within ten years or less. Nobody is going to have backwards compatibility in software which goes back a decade. If we are lucky, we get five years.

      Nowadays, a lot of programs don't cater to older hardware, which is due to companies being greedy. This is notably seen in most recent games: the quality of the code is bad, and the save-game formats are in many cases not even compatible between minute version changes of the same game. This seems to be becoming a disease.
      bkernst
      • Twilight of the GNUs

        [i]Sometimes I still find some old 16-bit apps made for Windows 3.1 which can still do what they were supposed to do, but many have issues due to the major changes done in Windows within ten years or less. Nobody is going to have backwards compatibility in software which goes back a decade. If we are lucky, we get five years. [/i]

        Have you tried compiling any of the basic Unix utilities (grep, sed, etc.) from the 70s and early 80s or any of the C utilities in [i]Software Tools[/i] lately?

        It's been a while since I last tried, but they all compiled and worked just fine then (about five years ago).
        Yagotta B. Kidding
        • Yes simple command line utilities do

          Some C command-line utilities that I wrote for 8-bit computers still compile today.

          Did you recompile an X application from the '80s?
          balsover
      • I don't entirely blame them

        Re: [i]Nowadays, a lot of programs don't cater for older hardware, which is due to companies being greedy.[/i]

        Well, programmers don't exactly like working on older systems either. The newer stuff is "sexy." As technology progresses it gets easier to do stuff that used to take weeks to develop. Companies have a profit motive to "not look back" too, because all they have to look forward to with older platforms is a stagnant or shrinking customer base.

        But I don't entirely blame companies for looking out for their own profit margins. As we've all experienced, the IT market is volatile. Sometimes the bottom can just drop out of it. When that happens, if you're running a company, you'd better HOPE you made a lot of money while the getting was good, or else you're looking at bankruptcy!
        Mark Miller
    • Interesting anecdote...

      ...and demonstrates why the impulse to standardize exists as much for developers as it does for the average consumer.
      John Carroll