Forces of nature

Answering sparkle farkle - with a detour through history that should be about 10,000 words long but isn't.
Written by Paul Murphy, Contributor

Last week's blog drew this from "sparkle farkle":

Convinced of my ignorance yet again...

Aside from the play by play, could you provide an example of how the DP method differs from Unix, as a theoretical? I don't understand how the user could be in more control under a centrally controlled system where IT is involved in (I have to assume) re-writing or modifying someone's program to suit, rather than installing (and letting the user configure) a program they already know. On the client side, the user has a long road to re-learning, or learning, the software in the first place.

I could cite a program like Photoshop and its Linux counterpart, GIMP: there's no way the two are the same. (I will admit I'm getting better at using GIMP, but it's just not the same in terms of ease of use and functionality.) In a creative environment there is no real replacement for Photoshop, or for AutoCAD in its many incarnations.

So you are left to virtualize, spend on licensing for the whole business, etc. How could a centrally controlled entity make this any better?

Here's my summary response:

In brief: Wintel, like DP, forces centralization of both processing and control. Unix allows (but doesn't force) people to separate these: centralizing processing while decentralizing control. Do that, and running IT (i.e. the gear, etc.) becomes a part-time job, while IT people working inside user communities work for those users in making things (i.e. software) happen.

But, of course, there's more to it - specifically, things got to be as they are through the processes of history.

About the time data processing was getting started, organizational design people were enthralled by one Frederick Taylor, a popular exponent of "scientific management" whose view of the worker as an undifferentiated, and thus easily replaced, cog in the corporate machine:

Under our system a worker is told just what to do and how to do it. Any improvement upon the orders given him is fatal to his success.

applies reasonably well to the kind of unskilled labor he studied, but becomes increasingly counter-productive as both task and organizational complexity increase.

In his own context, directing highly repetitive activities like shoveling coal into furnaces, Taylor's ideas worked; but what really sold them into social prominence was the implied moral and social distance between those directing the work and those carrying it out. In the crudest terms: he sold management on the idea that the managerial class was so much smarter and more capable than the workers that it had both a right and a moral obligation to direct every detail of the workers' lives.

The reality, of course, is that Taylor over-generalized from observations at the lowest end of the work-complexity scale: basically, the simpler and more organizationally isolated the task, the more applicable Taylor's liberal-fascism becomes - but the more complex and organizationally inter-linked a task gets, the more counter-productive attempts to apply it become. Thus Henry Ford usefully applied Taylor's ideas to individual work stations on the assembly line, but no five-year economic plan produced by an economic dictatorship anywhere in the world has ever come anywhere close to reality.

You can see how Taylor's ideas were attractive to the men running Finance departments after World War I: they bought IBM's machines, hired clerks to execute the individual steps in data processing, and hired Taylorites to make sure that those clerks did their jobs, and nothing but their jobs, in wholly predictable ways optimized with respect to the most expensive resource: the machines.

Forty to fifty years later, people who made their bones in that system faced an organizational transition to virtual machines: from physical card sorters controlled by switch settings to card image sorting controlled by JCL - and just continued doing what they knew how to do.

The biggest external enablers for this continuity were cost and ignorance - the latter because then, as now, finance people simply didn't want to know what went on in the black box labeled "data processing", and the former because cost continuity reinforces expectation continuity. Thus in the 1920s a line capable of end-to-end records processing for AR, AP, and GL cost about the equivalent of four hundred clerks hired for a year - and in 1964 so did the first System/360 installations, while the typical $30 million zOS data center today is not far off that same 400+ full-time-equivalent cost.
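
For what it's worth, the arithmetic behind that last equivalence is easy to check. Here's a minimal sketch, assuming a fully loaded cost of roughly $75,000 per clerk-year and treating the $30 million as an annual figure - both numbers are my assumptions for illustration, not figures from the column:

# Rough check on the "400 clerk-year" cost-continuity claim above.
# The $75,000 fully loaded cost per clerk-year is an assumed figure for illustration.
data_center_cost_per_year = 30_000_000   # typical zOS data center cost, per the text
cost_per_clerk_year = 75_000             # assumption, not the column's number

clerk_year_equivalents = data_center_cost_per_year / cost_per_clerk_year
print(f"{clerk_year_equivalents:.0f} clerk-year equivalents")   # -> 400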

In the 1920s that cost drove the focus on utilization, the role in Finance drove isolation and arrogance, and the combination of Taylorism with the after-the-fact nature of data processing both reinforced the other factors and enabled regimentation - both inside data processing and in its relationships with users. None of that has changed since: a DP manager magically transported from the 1920s could absorb the new terminology and carry on in most big data centers today without changing a single operational or behavioral assumption.

When Wintel started, things were very different: there was a booming personal computer industry, Sun was inventing the workstation, Apple was making the Lisa, science ran on BSD Unix, traditional research ideas about open source and data were widely established in academia, and thousands of large organizations were in the throes of conflict between traditional data processing management and user managers successfully wielding computing appliances from companies like Wang, Honeywell, DG, DEC, and many others to do things like document processing, job scheduling, and inventory management.

When the PC/AT came out, user managers facing ever-increasing corporate barriers to the purchase of appliance computing leveraged data processing's IBM loyalties to buy millions of them - only to discover that there was little useful software for the things. That created the markets and contradictions allowing Microsoft to succeed - and led directly to the 1990s PC server population explosion, with all its consequences for IT cost, security, and performance.

Those costs and consequent failures forced centralization: first of control and then of processing; until, today, most data processing wears a Windows face but is behaviorally indistinguishable from what it was in the 1920s - and the software, of course, has evolved in parallel to make locking down the corporate PC to imitate a 327X terminal the least-cost, lowest-risk approach to corporate "client-server" management.

Thus the bottom line on the merger of the DP and Wintel traditions in larger organizations is that any move away from centralized processing (whether implemented on zOS or Wintel racks) adds both costs and failures, while any move to decentralize control (letting users, for example, manage their own software) does the same.

None of this applies to the evolution of the Unix ideas: from the beginning, science-based computing has been about using the computer to extend, not limit and control, human communication and human abilities. Thus users are perceived not as data sources for reports to the higher-ups, but as members of a community of equals - and it's that perception of the computer's role as a community knowledge repository and switch that ultimately drove the evolution of open source, large-scale SMP Unix, and network displays like the NCD X terminal then and the Sun Ray today.

There are both cost and control consequences to this for commercial use of Unix: the organizational data center that takes a zOS machine or several hundred PC servers to run the DP way takes two to four SMP machines, and an order of magnitude less money, to run with Unix. Neither the us-versus-them mentality from data processing nor the software functional differentiation that gets pushed all the way back to the hardware in the Windows/DP world exists in Unix - and open source ideas limit both licensing and development commitments. As a result, processing centralization both minimizes system cost and maximizes the resources available to users, without requiring control centralization.

It is possible (and common), of course, to be stupid: insisting on the right to run Unix in DP mode by doing things like restricting staff roles, tightly controlling user access, customizing licensed code, or paying for software to chop that big machine into many smaller ones. Thus the people running the organization I've been talking about over the last few weeks would, I'm sure, respond to an externally forced march to Unix by combining virtualization with both processor and job resource management to increase the negative impact of the limits and problems they face with Windows. Right now, for example, the 2,000 or so users who check their email between about 8:16 and 8:30 each morning completely stall out the 20 or so dedicated Exchange servers and much of the network - and while none of this need happen with Unix, the unhappy reality is that everything these people know about running IT would lead them to spend money making things worse than they are now.

The point, of course, isn't that poor managers can't implement DP ideas with Unix; it's that good ones know they don't have to. The cost and risk forces that drove the adoption of DP ideas among Wintel people simply don't apply, so giving IT staff posted within user groups the authority to act immediately on user requests falling within some global IT strategy offers significant corporate benefit without incurring the costs or risks this would entail with a Wintel/DP architecture.

Note:

Sparkle farkle mentions two specific pieces of Wintel software, Photoshop and AutoCAD, as forcing Wintel adoption. In some situations he'd be right: if, for example, you had 2,000 users and 1,900 of them routinely required AutoCAD, then you'd probably find the Unix smart display architecture a poor solution. But if you have the more normal thing - 2,000 users, 30 of whom routinely use AutoCAD - then you need to remember that you're there to serve users, not to create and enforce computing standards, and so you give them what they need: a local Wintel (or Mac, if it's Photoshop) ecosystem all their own, complete with embedded support working directly for group management.

On the positive side, most of the costs of Wintel, particularly those associated with staff regimentation, security, and software churn, rise super-linearly with scale - so putting a bunch of "foreign" system islands into your Unix smart display architecture ultimately adds relatively little to the corporate IT bill. And remember too that users who only need occasional access to monopoly products like AutoCAD can be given that on their regular smart displays at no more than the same server and license cost the Wintel people face.
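
To put a number on that scaling argument, here's a toy calculation; the cost constant and the 1.3 exponent are invented purely to show the shape of a super-linear curve, and the 2,000 and 30 seat counts come from the example above:

# Toy model: cost of managing n Wintel seats as one pool grows super-linearly,
# cost(n) = a * n**k with k > 1. Both 'a' and 'k' are invented for illustration;
# only the shape of the curve matters here.
a, k = 1000.0, 1.3

def wintel_cost(seats: int) -> float:
    return a * seats ** k

whole_shop = wintel_cost(2000)    # everyone managed as one Wintel pool
island = wintel_cost(30)          # just the 30 AutoCAD users as a "foreign" island
print(f"island is {island / whole_shop:.1%} of the all-Wintel bill")   # ~0.4%
print(f"versus a proportional share of {30 / 2000:.1%}")               # 1.5%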

Finally, he also suggests that a Unix system requires a lot of code customization. This is not generally true: outside of research organizations, most large Unix systems run unmodified commercial or open source applications. Most original code does start on Unix (particularly Linux and Mac OS X these days), but that's because it's a natural development environment. In non-research use, code development and customization expense is almost always associated with the Wintel/DP mentality and rarely found in Unix budgets put together by Unix people.
