
Development environments: Microsoft vs. Open Source

In a previous discussion of OpenOffice versus Microsoft Office, most respondents addressing the development issue seemed to prefer the Microsoft development tools over the comparable open source ones. I believe this is probably correct from the individual developer perspective, but wrong from the corporate perspective.
Written by Paul Murphy, Contributor
A few weeks ago we had a fascinating discussion here comparing the functionality available with the StarOffice/OpenOffice open source cluster to that available with Microsoft's Office set. The general conclusion, I thought, was that the two sets of technologies are roughly equivalent on office functionality, but that Microsoft wins easily in terms of development integration and related tools.

Today I'd like to start expanding that discussion into two related areas: one focused on risk, the other on developed application functionality.

Since I believe in the adversarial advocacy route to truth used in the judicial system, I'm going to take the position, first, that the application delivery ideas underlying the architecture in use determine both the limits of its toolsets and the extent of the risk you accept by using them; and second, that the risks are higher, and the limitations more constraining, for Microsoft's client-server approach than for the Unix/smart display approach - where the latter is assumed to rely as much as possible on open source applications and development tools.

In today's opening round on this I want to challenge readers who use Microsoft's development tools in an Office or general business applications context to contest the following statement:

Anything you can do with Microsoft servers, clients, Office, and .NET technologies I can do better [i.e. at some combination of greater functionality, lower risk, and lower cost] using the standard LAMP/SAMP toolset with emphasis on OpenOffice and the Apache cluster.

I see three key differences between the two sets of technologies:

  1. the [LS]AMP technologies, particularly Cocoon, centralize processing, impose no client distinctions, and ultimately require storage of no more than one copy of the original data, document, or other material being worked with (see the sketch following this list);

  2. the [LS]AMP technologies are open source and rely on open standards and technologies; and,

  3. the [LS]AMP technologies heavily front-load the learning curve: once you know how to properly use one to do anything, everything else you can do with it becomes relatively easy.

    (In fact Apache Cocoon's documentation turns strong men pale - coming to it from a Wintel BASIC background has to be a lot like getting off the couch for a trip to the fridge only to suddenly find yourself stranded naked on an iceberg and forced to scavenge for fish among hungry penguins. A sample, from the "welcoming" page for the Maven development block:

    The purpose of the block preparation goal is making a block runnable as web application and enabling rapid application development. It uses the Jakarta Commons JCI library which provides a reloading classloader. As the names already promises, it is able to watch resources (e.g. .class files) for changes. In such a case the classloader will use the latest version of the resources in the future. Some Java servlet containers already provide the automatic reload of loaded servlet contexts in the case of changes but this comes with the downside that you might lose the application state and that you are not allowed to change the signatures of classes and methods.

    By no coincidence the Cocoon technologies are both the most powerful and the hardest to learn - but on the plus side the stuff works in depth, and people coming to it from a Unix C/Java background typically have no problems seeing the values and ideas behind the jargon.)
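
To make that first point concrete, here is a minimal sketch of the pipeline idea - written against plain JAXP rather than Cocoon's own sitemap and servlet machinery, and with file names invented purely for illustration. One canonical XML document lives on the server; the requesting client only selects which stylesheet renders it, so nothing gets copied to, stored on, or processed by the client:

    // Sketch only: plain JAXP standing in for a Cocoon pipeline.
    // report.xml, report-to-html.xsl and report-to-fo.xsl are invented names.
    import java.io.File;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class PipelineSketch {
        public static void main(String[] args) throws Exception {
            // the single server-side copy of the data
            StreamSource document = new StreamSource(new File("report.xml"));

            // the client's type picks a rendering; the data itself never moves
            String stylesheet = (args.length > 0 && args[0].equals("pdf"))
                    ? "report-to-fo.xsl"    // print route, via XSL-FO
                    : "report-to-html.xsl"; // browser or thin-client route

            Transformer t = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource(new File(stylesheet)));
            t.transform(document, new StreamResult(System.out));
        }
    }

In Cocoon itself that routing is declared in a sitemap rather than hand-coded, but the structural point is the same: the data exists exactly once, on the server, and the client is just a display.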

In the context of Office and general business applications the advantages offered by the open source approach should be obvious.

At the top of that list is risk reduction.

On the most superficial level: no client storage or processing means no client security exposure. Compare the risks of giving people access to confidential data on Sun Rays versus Wintel laptops, and the difference is obvious.

Go one step further, to secure document management, and every Wintel method both requires coercion and is subject to ifs, buts, and maybes - while doing it with Cocoon is straightforward enough that no photo of anyone's screen is going to make the front page of the New York Times, simply because the photo itself can tell you who leaked it.
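
How might "the photo itself can tell you who leaked it" work? One plausible approach - an assumption on my part, not a description of any shipping product - is to have the server stamp every rendered copy with the identity of the person requesting it, so any screenshot or photo carries its own audit trail. Continuing the earlier JAXP sketch, that amounts to one extra stylesheet parameter (the parameter and file names are again invented):

    // Hypothetical continuation of the pipeline sketch: watermark each
    // server-rendered copy with who asked for it, and when. The stylesheet
    // is assumed to declare xsl:param elements named "viewer" and "viewed-at"
    // and to fold them into the rendered output.
    import java.io.File;
    import java.util.Date;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class WatermarkSketch {
        public static void main(String[] args) throws Exception {
            String user = (args.length > 0) ? args[0] : "unknown";

            Transformer t = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource(new File("report-to-html.xsl")));
            t.setParameter("viewer", user);
            t.setParameter("viewed-at", new Date().toString());

            t.transform(new StreamSource(new File("report.xml")),
                        new StreamResult(System.out));
        }
    }

Because rendering happens in one place, on the server, adding that kind of accountability is a small server-side change rather than something you have to push out to, and enforce on, every client.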

More interestingly, a key risk many companies forget about involves long-term information access - all of the Microsoft proprietary formats, particularly those involving encryption, are subject to arbitrary change and therefore to loss. I don't know a single long-term Windows user who hasn't lost personal documents to Windows upgrades - and I know quite a few companies which have lost full or partial access to information - particularly with respect to formats, icons, signatures, and read/write time and change authorization records - due to third-party, mostly Microsoft-initiated, change entirely outside their control.

Open standards minimize those risks of loss - and most directly avoid such common remedial costs as loading, reformatting, and re-saving documents originally created with previous Word, Excel, or PowerPoint generations.
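
The open-formats half of that claim is easy to check for yourself: an OpenDocument file is just a zip archive of documented XML, so its content stays readable with nothing more than a language's standard library - no office suite, and certainly no particular vendor's version of one, required. A minimal sketch (the file name is invented):

    // Pull the document body out of an .odt using only the Java standard
    // library. content.xml is the standard OpenDocument entry for the body;
    // quarterly-results.odt is an invented example file name.
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipFile;

    public class OdfPeek {
        public static void main(String[] args) throws Exception {
            ZipFile odt = new ZipFile("quarterly-results.odt");
            ZipEntry content = odt.getEntry("content.xml");
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(odt.getInputStream(content), "UTF-8"));
            for (String line; (line = in.readLine()) != null; ) {
                System.out.println(line); // plain, documented XML - no vendor decoder needed
            }
            odt.close();
        }
    }

Try doing the equivalent against an encrypted .doc from two Office generations back without the matching Office release on hand.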

At the top of the risk actualization tree is the branch everyone developing and using Wintel applications falls off: millions of companies built for NT, rebuilt for 2000, revised for 2003/XP, and now face rebuilds for the 2008 Server/Vista combination - and can look forward to throwing away any successes they achieve in this process just as soon as Microsoft gets whatever it wants to sell next out the door.

In contrast, applications I worked on using Unix 4GLs like Accel with Unify 4.0 in the late 1980s have generally proven automatically upgradable to new technologies: most could be taken off those 60MB NCR and SunOS tapes, loaded on Solaris with PostgreSQL, and automatically converted to Java to run with no manual editing needed. And if you want anything you write using PHP/MySQL and Apache today to be usable ten years from now, all you need to do is stick to standards while squirreling away a copy of the source for everything involved - just in case.

Near the bottom of that list is hardware adaptability. Cocoon, in particular among the Apache technologies, is an almost perfect application set for demonstrating the power of Sun's CMT/SMP technologies - and would take less work than most to adapt to IBM's Cell processors.

Wintel, in contrast, is hamstrung by the lack of software adaptability to anything outside the x86 franchise - people porting Windows applications to Windows 2003 for Itanium have not, for example, much enjoyed the process or done much more than self-consciously defensive preening about the great successes obtained by trading down from cheap Xeons to multi-million dollar Itanics.

In between are other issues, many of which I hope to get to in the weeks ahead, and all of which seem, at least at first glance, to favor corporate investment in open source people over paying for proprietary licensing. Next Thursday, for example, I hope to explore the cost of training and retaining staff for both environments - but, until then, the bottom line is in the challenge - and for those of you who want to prove me wrong, here it is again:

Anything you can do with Microsoft servers, clients, Office, and .NET technologies I can do better [i.e. at some combination of greater functionality, lower risk, and lower cost] using the standard LAMP/SAMP toolset with emphasis on OpenOffice and the Apache cluster.

--------------------

A footnote: Microsoft's DirectX has nothing to do with X - and along that same line there's a Microsoft Cocoon project which appears to be just another government/Microsoft health care information exchange boondoggle whose naming Microsoft seized on to confuse the market.

Ironically, however, it may actually be predicated on use of the open source Apache Cocoon software.

The project's "technology infrastructure" page consists entirely of a diagram with the word "Cocoon" in small vertical print on the XML piece and this cogent explanation:

The most important element on which COCOON will be based is its Infrastructure. Several technologies are involved into it because it will be the most important part of the project. On top of services offered by this Infrastructure, it will be built all the knowledge related services, collecting information from already existing peers, nodes, which are the real knowledge repositories, which integration is the COCOON target. The figure below gives the vision of how Smart Search Engines, K.M. Services, and Multi-channel delivery are connected to the Infrastructure and knowledge repositories.

So I don't know what they're doing (besides getting paid) - but if I wanted to do this kind of thing for real, I'd certainly use Cocoon to do it.
