Last week reader leigh@ wrote:
OK I get the picture but...
When will, or how will we get an article that helps us unfortunates who were trained on M$ across the line with Linux?
The second comment that cited the article as pro M$ made me laugh, and the response to that is typical and I didn't read any more of the inevitable OS flame wars. Could we have a clear concise article on what they should have done in the transition from NT4 to Linux or even better...the same article covering how to transition from what they have now to Linux.
We use Fedora 9 in a VM at work, on a M$2008 server. I'd like to go away from M$ servers, retain .net stuff and move a lot of stuff to php. Troubles is the ratio of info about 'How OS xyz is better'n OS $' to 'How to architect a change to OS xyz and why' is about a hundred to one. I know mono may help me but I am having trouble finding time and information because juvenile jingoistic OS pundits write reams of crap. Help I'm drowning in FUD, and some of it is open source...
I will be revising my Unix Guide to Defenestration before serializing it here later this year - and that book, originally written in 1999/2000, is dedicated to meeting his needs.
Notice that I'm not concerned, and I assume Leigh isn't either, with the specifics of individual conversion processes - i.e. the question isn't "how do you convert from .net to mono?" but "how do you convert from a Wintel orientation to a Unix one?"
The single most important pre-condition for success in doing this is to be very clear that Unix is not Windows, not zOS, and not VM - and that what you know about managing these other architectures has co-evolved with those architectures and therefore may, but more likely will not, apply in the Unix world.
Some skills and ideas are transferable, but many of the management and technology assumptions - particularly the ones people are surest of, and are therefore least likely to question - are likely to prove wrong in subtle and devastating ways.
Two very basic differences, for example, tend to so utterly confound data processing (and now Wintel) people that they never learn to use Unix properly:
- With Windows (and zOS) you care mostly about applications, with Unix you care mostly about users.
This has many consequences; for example, the natural thing to do with Windows (and any other data processing architecture) is to put your people as close to the gear as you can - where with Unix you do the opposite: spreading your people throughout the organization by putting them as close as possible to users.
- With Windows (and zOS) the IT job is mostly about managing the IT resource: infrastructure, people, and applications - but with Unix, the IT job is mostly about serving users.
Both of these are consequences of differences in costs, risks, and performance. With zOS adding another job can upset a delicate balance between limited time and expensive resources; in Windows adding an application for a few users can have unexpected consequences across the entire infrastructure, and, of course, in both cases performance and flexibility are limited while change costs are high.
In contrast, the risk of adding a new application in the ideal Unix environment - big, central processors with smart displays - is trivial; and the cost of doing things like creating a container for users who want "the database" as it was last February 19th at 10:00AM, please, is, like its performance impact, essentially zero.
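That point-in-time database request is, in practice, roughly a snapshot-and-clone operation. A minimal sketch using ZFS - assuming a pool named tank holding the database files in tank/db, and that a snapshot from the requested moment already exists; the dataset and snapshot names here are hypothetical:

```sh
# Clone the database filesystem as it existed at the Feb 19th 10:00 snapshot.
# Assumes a snapshot named tank/db@2009-02-19-1000 was taken at that time.
zfs clone tank/db@2009-02-19-1000 tank/db-feb19

# Mount the clone where the requesting users can reach it.
zfs set mountpoint=/export/db-feb19 tank/db-feb19
```

Because the clone shares unchanged blocks with the original filesystem, it consumes essentially no extra space or I/O until someone writes to it - which is why both the cost and the performance impact are close to zero.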
From a grunt's perspective the key operational difference is that with Windows you spend most of your time keeping things working - but with Unix you set systems up to work and trust that they do, thus freeing yourself to spend most of your time not futzing with the system, but acting as the human facilitator in the system's interface to users.
As a manager the big difference between Unix and traditional data processing gets expressed most clearly in the default response to user-originated change requests. With zOS (and now Wintel) the risks and costs of change are so high that the right answer is almost always "no" - and shunting persistent requesters into the budget process is an appropriate learned reflex because it works to provide both budget growth and time.
In contrast, Unix costs and risks are so low that the right answer is almost always to simply say "yes" and move directly to the how and when.
This difference has a major organizational consequence with respect to role separation. When Finance spun out data processing in the 1920s, role separation naturally came along - and is still embedded in the COBIT/ISACA data center operational standard. Unix, however, came from the science side and has no evolutionary history to justify any of this - meaning that the right thing to do is to wear a suit and a bland look in meetings with your auditors, but actually cross-train everyone to do just about everything while leaving within-team assignments for team members to sort out among themselves.
In practice, of course, you see rigid role separation applied to Unix, but this is almost always the result of organizational evolution and the decision making roles played by people whose assumptions reflect Finance, audit, or data processing backgrounds. In general what happens in those cases is that Unix gets used as a cheaper something else - and that can work, but doesn't take advantage of the technology's real strengths.
Most IT executives find it extraordinarily difficult to accept that you get the best results with Unix by cross-training your people, freeing them to make their own operational decisions, and centralizing processing while distributing real functional control to people working one on one with users; but this is the only known route to making corporate IT what it should be: a cheap, fast, and trusted "nervous system" for the business.
As I say in defen, the difference is that between management and leadership. With management you organize to get a known job done in repeatable, controllable ways - and that's the right way to address something like printing customer lading reports for a railway in the 1920s: you train people to operate each machine, put someone in charge of each line, yell "go" at the right moment, and then wander around ensuring that each batch gets handled "by the book" by exactly (and only) the right people at each stage from keypunch to print distribution.
With IT, however, the job changes daily and you need leadership: the process of focusing more brains on goals; not management: the process of organizing hands to execute well understood processes. Thus the very basis of applying data processing methods, whether with zOS or Windows, is antithetical to the IT job - and therefore to Unix as a tool for doing the IT job. Basically, most corporate data processing is organized and equipped to pound square pegs into round holes - and thus the amazing thing about these shops isn't that they constrain organizational change while costing far too much and doing far too little, it's that they work at all.