From Chapter Four: The Unix and Open Source Culture

Part two of the introduction to the Unix and open source cultural grouping.
Written by Paul Murphy, Contributor

This is the 38th excerpt from the second book in the Defen series: BIT: Business Information Technology: Foundations, Infrastructure, and Culture

About this grouping (2)

The underlying change drivers here are resource cost and scale - with the cost equation reversed for Unix relative to the System 360 and its descendants.

For example, the smallest mainframe, the z8X0, starts at about $250,000 before licensing and peripherals and has one usable CPU running at about 760MHz. In contrast, a Dell 2650 with twin 3.2 GHz Xeon processors running Unix against a 1TB external RAID array costs about $12,000 and handily outperforms the z800 on all measures - but people who only know mainframes still buy the z8X0.
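The price gap quoted above is easy to put in perspective. A back-of-envelope sketch (the figures are the article's approximations, not vendor quotes, and aggregate clock speed is only a rough proxy for throughput, not a benchmark):

```python
# Back-of-envelope comparison using the approximate figures quoted above.
z800_price = 250_000        # smallest mainframe, before licensing and peripherals
dell_price = 12_000         # Dell 2650, twin Xeons, 1TB external RAID

# How many Dell 2650 servers could you buy for one entry-level z800?
servers_per_z800 = z800_price // dell_price
print(servers_per_z800)     # 20

# Aggregate clock on offer for the same money (rough proxy, not a benchmark).
z800_clock_ghz = 0.76       # one usable CPU at ~760 MHz
dell_clock_ghz = 2 * 3.2    # two 3.2 GHz Xeons
print(servers_per_z800 * dell_clock_ghz / z800_clock_ghz)  # ~168x the raw clock
```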

You can use partitioning and virtualization with Unix - but it isn't very smart
System partitioning and virtualization are mainframe solutions to mainframe cost and process management problems - but both are now widely used with commercial Unix. System partitioning began as a way of splitting a multi-million dollar machine into two pieces - a large production piece and a small development piece - so that data centers could run both with only one multi-million dollar machine.

Virtualization began as a memory management method ensuring that one programmer's job couldn't interfere with another's.

Thus IBM's interactive mainframe OS, zVM, depends on virtualization to separate users, while the underlying SP layer allows the z900 hardware to be partitioned into up to 15 logical machines, each of which can load zVM to run multiple virtual machines - one per user.
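The layering described above can be pictured as nested containers: hardware partitions at the bottom, an OS per partition, and per-user virtual machines on top. A minimal sketch (the partition limit follows the z900 figure in the text; the names and the per-partition user count are illustrative):

```python
# Illustrative model of the mainframe layering described above:
# hardware -> up to 15 logical partitions (LPARs) -> zVM per LPAR -> one VM per user.
z900 = {
    f"LPAR-{n}": {                 # SP layer: hardware partitioning
        "os": "zVM",               # each partition can boot zVM...
        "virtual_machines": [      # ...which virtualizes one machine per user
            f"user-{u}" for u in range(3)   # user count here is arbitrary
        ],
    }
    for n in range(1, 16)          # z900 supports up to 15 logical machines
}

print(len(z900))                         # 15 partitions
print(z900["LPAR-1"]["virtual_machines"])
```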

Companies like Sun and HP added both partitioning and virtualization to Unix because thousands of mainframe customers knew they needed them. As a result, you can now get a 64-CPU Sun 20K with 512GB of fully symmetric memory for about $1.4 million and then partition it eight ways to run as eight machines with eight CPUs and 64GB each.

Of course, if eight CPUs and 64GB of RAM suffice for your jobs, getting a rack of three Sun 1280s, each with 12 CPUs and 96GB of RAM, gets you the same structure with a third more hardware - and roughly three quarters of a million bucks in change relative to that 20K.

Similarly, the complexity of application interactions and the reliance on the reboot/reload debugging cycle has taught the Microsoft PC community to run one application per box. Today, with Linux taking over backend services in many organizations and businesses, Windows experts are replicating this structure with Linux - which doesn't have the registry-related problems the practice responds to - by doing one-to-one conversions, and are nevertheless getting both performance improvements and cash savings over their Windows installations.

The bottom line on business use of Unix is that the system is so cheap and flexible that it generally produces net business benefits even when misused. Partitioning that Sun 20K destroys the value of having access to 512GB of fully symmetric memory - but the result is still a third faster than the six-million-dollar base z900 at about 20% of the cost.
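The closing cost claim checks out arithmetically; a quick sketch using the article's round numbers:

```python
# The article's round numbers: a partitioned Sun 20K vs. a base z900.
sun_20k_price = 1_400_000
z900_price = 6_000_000

ratio = sun_20k_price / z900_price
print(f"{ratio:.0%}")    # prints "23%" -- i.e., "about 20% of the cost"
```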

Another way to look at this is to see that, because Unix can be used by almost anybody to do almost anything in computing, there has been little business pressure to develop unique cultural attributes.

There is a unique Unix culture, but it developed and flourishes in the research, not business, community. In the research community, cost and performance pressures on individuals combined with the general absence of cross-cultural contamination to produce Unix best practices like peer review, networking, user empowerment, the open source movement, and direct end-user control of systems decision making.

The myth of responsibility in software
Sun now sells StarOffice licenses instead of giving them away. Why? Because many potential users insisted on the right to pay. In part this is the result of a myth: the notion that a software user should be able to hold a supplier responsible for software failures. In reality, companies can't sue Microsoft because Word loses files, or SAP because their implementations failed. Read your Microsoft end user license agreement carefully - Microsoft accepts no liability - and neither does Sun when you license StarOffice.

This happens in other computer arenas too. Small consultants are replaced with international companies because the client likes the security of dealing with a company big enough to stand behind its work - but such companies never do, and smarter customers generally know it. The Unix management ideas described above have only recently started to appear in business uses of the technology, as open source products have begun to displace proprietary solutions.

In large part this is due to the head start other toolsets had in developing business oriented computer cultures. Large companies generally select systems managers using their experience in other large companies as a proxy for expertise, thereby failing to recognize that the skills needed to properly deploy a technology tend to reflect that technology. As a result most of the executives responsible for large Unix sites have mainframe, mini-computer, or Microsoft PC backgrounds that lead them to mis-manage Unix.

Some notes:

  1. These excerpts don't (usually) include footnotes, and most illustrations have been dropped as simply too hard to insert correctly. (The WordPress HTML "editor" used here allows only a limited HTML subset and is implemented in a way that forces frustrations like the CP/M line delimiters MS-DOS inherited.)
  2. The feedback I'm looking for is what you guys do best: call me on mistakes, add thoughts/corrections on stuff I've missed or gotten wrong, and generally help make the thing better. Notice that getting the facts right is particularly important for BIT - and that the length of the thing, plus the complexity of the terminology and ideas introduced, suggests that any explanatory anecdotes anyone may want to contribute could be valuable.
