Resolved: Unix disrupts organizations by flattening hierarchies

Summary: You won't find an organization of any scale - say 500 employees and up for the entire period - which did not strengthen hierarchical controls with every new generation of Wintel products brought in, from PC-DOS to Vista.

Be it resolved that properly managed Unix infrastructures disrupt organizations by reducing the importance of hierarchical reporting relationships as information conduits in an organization.

There are lots of subtle components to this. First, we need to agree that organisational hierarchies exist primarily as information conduits linking a small number of people at the top to a large number at the bottom: thus the king's mistress tells him she'd look good in a Paris gown, he tells his nobles about his new ambitions, they tell the army captains, who tell the guys that actually go out and invade France.

Or, in a more modern context, the retail scanner's output triggers a gown re-order, which ultimately leads someone in Suzhou to throw a few too many mulberry leaves into the silkworm cages - which ultimately leads to a price reduction at the scanner.

Basically both cases illustrate hierarchical information flows from customer to supplier, and both illustrate that organizations exist mainly as mechanisms whose role is the hierarchical distribution and collection of information.

In that context any good IT infrastructure acts as an organisational nervous system, transferring information from one part of the body corporate to another - and so the question really is whether doing Unix right has a systematically different effect on the organization than some other architecture, say Microsoft client-server.

I say it does: specifically, that doing Windows right requires a strong hierarchy because organisational use contradicts the inherently single-user nature of the Windows PC - while doing Unix right promotes sharing and thus peer-to-peer relationships among the people involved.

To see the first half of this, think about the first PC-DOS PCs entering user hands: the mantra, set up in opposition to central control, was "one man, one computer" - something that gave countenance to the lie built into the name "personal computer" for an organisational resource, and combined with the technology's lack of networking to set the thing's tendency to isolate users from one another in emotional concrete right from day one.

It's still largely like that - even with today's born-again mainframe centralisation, stereotypical acolytes, fully locked-down desktops, and outsourced helpdesks, you get users pretending that their desktop computer is somehow more theirs than the organization's - a delusion that survives everything from DHCP boot and regular use of that PC as nothing more than an expensive and somewhat dysfunctional terminal, to repeated management slapdowns for trying to install or run unapproved applications.

Bottom line: you won't find an organization of any scale - say 500 employees and up for the entire period - which did not strengthen hierarchical controls with every new generation of Wintel products brought in, from PC-DOS to Vista.

What's going on with that is simple: the PC sales pitch still thrives on the one man, one computer idea - but in action that isolates individuals from each other, thereby contradicting organisational agendas, and that forces the organization to strengthen its internal controls in response. It's a classic negative feedback loop: the more PCs you bring in, the more pressure toward individual isolation, and the stronger the countervailing organisational controls required for things to work.

Unix, of course, went the other way right from the beginning - as Dennis Ritchie put it: "What we wanted to preserve was not just a good environment in which to do programming, but a system around which a fellowship could form."

That doesn't mean every Unix installation acts as an information switch - most of the managers come from the Windows/data processing world and haven't a clue - but it does mean that Unix naturally unifies where Wintel naturally divides, and thus aligns itself with achievement of the information communications objectives underlying organisational design.

In practice what that means is just that organizations switching to Unix enterprise desktops find that they need fewer layers of management - first in IT, and ultimately in the business itself.

The IT end of this is obvious: no desktop OS means no helpdesk, no reboot/reload cycle, no network mess, no complex data storage or backup, and no army of support techs to manage.

The absence of that management layer means that the people who do the work can direct the work - meaning that users can communicate directly with the people who make things happen, and the strict hierarchical controls used in other environments gradually give way to a more relaxed and co-operative work environment.

Ultimately that puts users in control of IT - leading to things like Monday's resolution: immediate action by the responsible sysadmin or DBA on user requests that would otherwise have to be booted up a hierarchy and decided by a committee. It's the universal experience: put users in charge of IT without turning them into IT people and they do exactly what Dennis Ritchie and his colleagues had in mind: they form fellowships - people in production working with people in engineering, engineers working with procurement, sales people working with logistics people: all co-operative efforts formed in peer-to-peer links crossing organisational lines - and obsoleting the official hierarchy as the organization's primary conduit for information interchange.

 

Talkback

  • Don't Follow

    Sorry, I just don't see the connections.
    Erik Engbrecht
    • ok - try this

      1 man 1 computer => fragmentation
      orgs respond to fragmentation by strengthening hierarchies, imposing additional controls

      Many people: 1 computer => co-operation and community
      Orgs respond by relaxing hierarchical controls
      murph_z
      • still doesn't follow

        The basic metaphors used for computing don't change, so people will still think about it in the same way.

        You get more reliability and flexibility with Unix, but that doesn't require centralization of computing resources.

        For many computing tasks, I would argue the location of the computing resources shouldn't matter to the end user. What matters are the performance characteristics the user perceives from his terminal.

        Furthermore, I would argue that a distributed computing model could produce a much more reliable, performant, scalable, and cost-effective solution.
        Erik Engbrecht
        • umm. let me suggest some reading

          McLuhan's Understanding Media applies to organizational structures and the impact of systems design too. It's hard, I know, but hey, it'll be good for you...

          Remember: I'm not talking about the operation of any solution here, I'm talking about the effect the thing has on the organization it's supposed to serve.
          murph_z
          • you got it backwards

            You're suggesting obsolescence.
            Why are you still stuck in the dark ages?

            Maybe you should read up on newer technologies. It's hard, I know but hey it'll be good for you...
            code_Warrior
          • You've said this before

            but repetition doesn't make it less silly - I think you're confusing management methods with technology. Mainframe management methods suck - they're from the 1920s (literally) and reflect agendas that haven't been appropriate for 40+ years. But Unix enterprise desktops aren't from that culture at all - they originated with science-based computing and have nothing at all to do with data processing.

            Remember: computing and data processing both use computers, but that's where the similarities end. Don't confuse some overlap in technology with overlap in methods, agendas, or results.
            murph_z
          • Might I suggest...

            "The Myth of the Machine", by Lewis Mumford, to provide some balance to McLuhan. His thesis is that the advancement of civilization (and I think it would apply to the organizational structures you speak of) should not be measured by the technology we've developed, but rather by the symbolic structures we've developed in our cultures. He sees technology as merely an enabler, helping us implement what we understand, and that it has minimal power to change us. You could hand the best technology around to someone with minimal understanding and they'll still fail to use it to its potential. I think his theory leaves open the possibility that technology can free people to entertain new ideas that they did not feel free to think of before, but ultimately it's people that change their point of view. Mumford didn't think that technology had that sort of power.

            You've said it yourself over and over again: Hand Unix to a bunch of data processing professionals and they still misuse it.

            I don't think this invalidates McLuhan's theory. Technology does convey a message. I've experienced it, but I think not everyone reacts the same way to the message, because they're not receptive to it. They may lack sufficient understanding to receive it.
            Mark Miller
        • Thanks for singing my tune!

          ;)
          Roger Ramjet
      • Not organizations

        You wrote:

        "1 man 1 computer => fragmentation
        orgs respond to fragmentation by strenthening hierarchies, imposing additional controls"

        That's not a response from the full organization; it's a response by people who are determined to have control over IT within IT.

        The organization resists in a variety of ways, all of them intended to reduce the effects of actions taken by those making themselves antagonists.

        Murph, you've discussed the successful rebellion often enough. Why recur to an obsolete model? Doesn't that prevent you from discussing a Unix-based approach's problem, the threat of removal of autonomy from those who fought to win it?
        Anton Philidor
        • Nope: wrong perspective

          I am talking about organizational structure, not IT vs the world; I know you're trying to cast this in a way that lets you make your point - but try to think a little more globally.

          Here - you've just been given a job: design a 5000 employee company to exploit moon dust... ask yourself how the divisions come together, what controls you need, etc. - then discuss the issue of centralizing tendencies vs fragmentation.
          murph_z
          • Hmmm... no, I think Erik has it right.

            What you're talking about is centralizing computing resources, but that doesn't necessarily change the way people interact with their machines or each other (nor does it necessarily require Unix). Erik is right in pointing out that the user need not know (or care) where the computational tasks are being performed. For all they care, "the computer" is the monitor and keyboard; any more detail is TMI.

            It is blatantly obvious to even the most casual observer that "one person, one computer" is functionally indistinguishable on every level from "one person, one workstation". The only functional alternative, "many people standing in line for a computer" is an absurd bottleneck we discarded decades ago.

            That being the case (and it [b]is[/b]), there's no inherent difference between an organization using centralized IT vs. one using distributed IT. (That is, no difference to anyone except the minority of workers in IT.) The users are going to use their machines as they always have. You can [i]try[/i] to prevent that by standardizing on specific tools, methodologies and procedures, but in the real world (vs. some academic thought experiment) this foray into medieval structure will only encourage your users to look for their solutions outside of the box you've constructed.

            We have historical demonstrations of this. There are [i]reasons[/i] we have distributed systems today: one size does not fit all; no procedure is appropriate to all situations; and PCs provided the kind of flexibility that's necessary and sensible and wasn't provided in timely fashion by centralized IT. In response, modern centralized systems provide that kind of necessary operational flexibility while maintaining centralized control over the plumbing. While this is significant to IT organizations, nobody else in an enterprise really gives a rat's behind. As a result, the connections I think you're trying to make here no longer exist. Unix is no longer disruptive. And organization hierarchies do not have to change in response to said non-disruption.

            This is even more obvious when we consider that organizational structure is defined by the need for organizational entities to interact. These entities are composed of [i]people,[/i] not workstations. And these people do not confine their interactions to the "nervous system" defined by their IT plumbers; rather, they interact through personal contact (meetings), email, IM, phone calls, as they themselves choose; through conduits they provide should IT fail to do so. It is ridiculous to suppose that it should matter to them at all whether that communication is provided by named pipes vs. peer-to-peer IM vs. little digital mailmen hand-delivering emails. [i]They don't care.[/i]

            Corporate hierarchies are far more strongly determined by factors such as the number of direct reports assigned to a manager and the [i]variety[/i] of those direct reports. For instance, one retail chain regional manager can handle a large number of store managers, as they all perform similar functions. Another manager may have authority over a smaller number of disparate departments, as increased variety introduces complexity into his interactions. These, of course, are functional and business considerations that have nothing to do with IT. And that's OK. For the most part, IT should be invisible (by which I mean "taken for granted"). If it's not then you're doing it wrong.
            dave.leigh9
          • Right premises, wrong conclusions

            I agree with everything you say about users and about the role of IT, but the conclusion you draw from this - that internal delivery structures don't affect external usage structures - simply isn't supported by your premise.
            murph_z
          • If so, prove it.

            [i][b]I agree with everything you say about users and about the role of IT, but the conclusion you draw from this - that internal delivery structures don't affect external usage structures - simply isn't supported by your premise.[/b][/i]
            Oh, I think it is. I've been around way too long and have seen far too many IT tails trying to wag their business dogs to conclude otherwise. In every case the observation of my last paragraph has held true; that in any modern computerized environment the complexity of the personal interactions based on operational roles overshadows any infrastructure these people utilize for communication.

            You make an extraordinary claim: that it matters -- not just to the business users -- but to the [i]organizational structure[/i] whether there's a PC connected between the screen/keyboard and the wall or not. I'd say that requires extraordinary evidence. I haven't seen it.

            Just in case you think I've missed some extraordinary argument already, let's look at some of your statements:
            [b][i]First, we need to agree that organisational hierarchies exist primarily as information conduits linking a small number of people at the top to a large number at the bottom:[/i][/b]
            No, we [i]don't[/i] have to agree there, in that "conduit" is the wrong metaphor. It implies unimpeded, unfiltered flow of information. In practice it is vital for hierarchical levels to act as amplifiers and filters. Amplifiers for the information going down the chain and filters for the information going up. This is due to the fact that strategic directions are inadequate to execute a plan, and far too much information exists at the lower levels to be useful to strategic planners. This is [i]why[/i] personal interactions trump infrastructure. This is [i]why[/i] individual job complexity limits how flat an organization can be.

            [b][i]...doing Windows right requires a strong hierarchy because organisational use contradicts the inherently single-user nature of the Windows PC...[/i][/b]
            You completely confuse the operating system with operational tools and procedures here. For instance, standalone PCs (which could just as easily be Macs, Linux, or Solaris boxes, BTW... the core of your argument is about an architecture, not an OS) may very well be used, but that doesn't mean that the processes employed there (or the tools) are disconnected and exist in a vacuum. Anyone familiar with corporate IT is familiar with the concept of workflow, in which an action on your part prompts an action by someone else on the network. I manage and create workflow-enabled apps (in Notes/Domino) every day. You most certainly don't [i]need[/i] Unix for that (and speaking as somebody that likes Unix, I simply can't pretend otherwise). The existence of workflow [i]alone[/i] decimates your argument, because the connections you claim reserved for Unix exist for [i]any[/i] connected architecture.

            [b][i]...regular use of that PC as nothing more than an expensive and somewhat dysfunctional terminal[/i][/b]
            If we take you at your word and allow that the PC is commonly [i]used as[/i] a terminal, then what is the functional difference between that and using an [i]actual[/i] terminal? Where is the pressure sufficient to change the layout of the org chart?

            [b][i]the more PCs you bring in, the more pressure toward individual isolation[/i][/b]
            In this observation you're contradicting your earlier observation about PCs used as a terminal. We already know that PCs are connected not only through terminal and web services, but also through client/server and peer-to-peer technologies. You [i]say[/i] there's pressure to work in isolation, but business processes require employing these various connections, so the observation lacks verity.

            The one argument that might be effective -- that Unix allows people to handle more complex relationships -- wouldn't fly either, since this is more a function of applications than architecture. Perhaps you have a "home run" argument that justifies the "Resolved" in your subject line; if so, I'd love for you to make it.
            dave.leigh9
          • More

            Yes, client-server is client-server and about equally counter-productive no matter how you implement it. I used Windows simply because that's common.

            Most of your argument (which is pretty good) boils down to denying the belief that organizations are information processing devices, and therefore denying the conclusions I draw using this as an a priori belief. That's outside the scope of this argument, but I'll try to bring it back later.

            Your question - if I say that PCs are generally used as expensive terminals, then how can I maintain that they drive isolation and cost? - exemplifies the communications failure we have here. What makes the PC an expensive and dysfunctional terminal is its operational complexity. That complexity then drives costs, and the response to those costs drives IT management attitudes and infrastructure - it's why they like lockdowns and couldn't care less about what users want, since any user want is likely to be an IT cost.

            With a real X-term those desktop costs simply don't exist - no security issues, no local software issues - nothing. As a result the pressures these costs create on management don't exist, and therefore neither do the repressive responses. So what happens instead? Unix unites where Windows divides - even if you use Windows as a terminal to Unix - because it's not what the user does with it that counts, it's what it costs IT to make any usage possible.
            murph_z
          • Good discussion

            [b][i]Most of your argument (which is pretty good) boils down to denying the belief that organizations are information processing devices, and therefore denying the conclusions I draw using this as an a priori belief. That's outside the scope of this argument, but I'll try to bring it back later.[/i][/b]

            Actually, the core of my argument is that I deny your apparent assumption that the connections between entities in an organization are primarily dependent on information systems. I hold that the lines on that org chart represent interpersonal relationships and do not represent IT conduits.

            The points you just raised concerning costs and complexity are certainly valid arguments for adopting flexible centralized IT systems. That's good in itself, but it is what it is: those arguments don't touch those interpersonal relationships on the org chart. I say that there are innumerable ways of facilitating those relationships.

            Nevertheless, this is the sort of conversation that gets us thinking consciously about things that are normally simply felt in the gut. That's a good thing. Whatever your conclusions, thinking them out helps to turn assumptions into real decisions.
            dave.leigh9
  • System would not be w/o user errors

    Since you've been advocating the Sun Ray system, I assume you are talking about that setup. I just remember that many years ago when I was in college we had a setup similar to what you describe. Even when only programming students were involved, people who were technically knowledgeable, they still managed to screw things up, and support staff had to intervene to fix a corrupted account, or reboot a system, though reboots were rare. Problems with students running rm -r * at the root of their account and losing all of their files were rather common.
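
    (For anyone who never ran it: the shell's * doesn't match dotfiles, but everything else goes, and classic Unix filesystems had no undelete - so from the top of a home directory the classic accident looked roughly like this:)

      $ cd            # back to the top of the home directory
      $ rm -r *       # recursively deletes every visible file and subdirectory
      $ ls            # nothing but dotfiles left, and no way to get the rest back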

    As you suggest, the support staff was small. And by the way, not all the terminals were dedicated X terminals. Some were what were called "diskless clients". I'm not sure why, because they had hard drives in them. They were full-fledged desktop Unix machines. I believe every student had their own accounts on them (set up by a centrally automated process). People tended to log in on them and use them as X terminals, but occasionally we ran software locally on them.

    I had a brief stint as a Jr. Unix Admin. at another university, while still in college, and while the support staff was small, every day we got problems to solve for different clients, all of whom were using terminals to access accounts on Unix. The job was never done. Occasionally somebody would mess up a system, which would have to be rebooted, but most of the time the requests were for configurations, or "I want to do X. How do I do it?" So sometimes we doubled as a help desk.

    In both situations, one of the things that became rather common is the support staff had to go after people for doing malicious things. Since the machines were networked, and open, it was possible to take a bitmap graphic and display it on someone else's screen without their permission. Sometimes things got dicey, with male students putting porn images on a female student's screen - and this was back in the early 90s, before the internet (and internet porn) got popular. Anyway, sexual harassment actually became an issue in the computer lab.
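
    (The mechanics were embarrassingly simple. X access control was routinely disabled with xhost +, and once it was, any networked user could draw on your screen - a sketch, with the hostnames and filename invented:)

      victim$ xhost +                                         # turns off X access control entirely
      prankster$ xsetroot -display victim:0 -bitmap pic.xbm   # tiles the victim's root window with the image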

    At my Admin. stint I witnessed someone trying to forge an e-mail to someone else (posing as a different sender), using their "inside knowledge" of Sendmail.
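
    (The "inside knowledge" amounted to knowing that Sendmail of that era accepted whatever envelope sender you typed - the well-known talk-to-port-25 trick, roughly, with all the names invented:)

      $ telnet mailhost 25                 # speak SMTP to the mail server directly
      HELO somewhere.edu
      MAIL FROM:<dean@university.edu>      # unverified - any sender address was accepted
      RCPT TO:<student@university.edu>
      DATA
      Please see me in my office at noon.
      .
      QUIT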

    In both cases it was up to the staff to discipline these folks. It was not the picture of harmony that you paint here. Because the systems were set up in the spirit of openness, and therefore the honor system, some people abused the power available to them.

    Even for me, a little knowledge was a dangerous thing. I remember as a freshman one night using a utility (I think it was called "write") that allowed me to send a message to any user who was logged in. It would display directly on their shell screen as soon as I sent it. I was done with my programming assignment, and I assumed (naively) that everyone else I knew was, too. I sent some messages to a couple other folks, asking whether they thought the assignment was hard, etc. Just chit-chat. I also assumed they knew what I did about sending messages. Boy, was I wrong! I got a couple students mad at me, because they thought my messages, which came up on the screen where they were editing their work, had ruined what they were working on! My messages were overwriting their text on the screen. What they didn't realize was that the editor they were working with had no knowledge of my messages, and so did not enter them into their text. It didn't matter. They went by what they saw on the screen. Oh well. I realized it was best to be merciful with students who weren't as into learning about Unix as I was.
    Mark Miller
    • yeah.. so?

      Write (and wall) provided text messaging as early as the mid-seventies - and just because some people didn't know how to use it doesn't mean there was anything wrong with the tool.
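
      For anyone too young to have seen it, a session went something like this (a sketch - terminal names varied by system):

        $ who                            # see who's logged in, and on which terminal
        alice    ttyp3    Mar 2 14:01
        $ write alice ttyp3              # open a channel to alice's terminal
        done with the assignment yet?    # each line appears on her screen as you type it
        ^D                               # end-of-file ends the message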

      My personal favorite in terms of bugging other people was a thing called eyes - which put a pair of eyeballs on the remote terminal that then followed that user's mouse... and if they had sound you could make it say things like "I'm watching you..." at start-up.
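
      (The stock X client xeyes does the eyeball-following half of that, and like any X program of the day it takes a -display option - so on an open display the gag was a one-liner:)

        $ xeyes -display colleague:0 &   # eyes pop up on the remote screen and track its pointer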

      Horrible, huh? But all of the stuff you mention (and worse stuff most of us did) can be, and is, done more maliciously and untraceably with PCs, so I don't see any of it as an argument against Sun Ray.

      More importantly, take a look at what you wrote and what it says about how you worked with users: yes you offered help desk services, yes you rescued people from their own mistakes, and yes you always had something to do - now how does that differ from the PC world? First: you had closer user contact and greater responsibilities - and, I'll bet, you had more fun and your users were better off.
      murph_z
  • McLuhan ROFL

    I couldn't stop myself paying Paul some more money.

    Marshall McLuhan - the Andy Warhol of the English language and sociology set. Known for his great theories with no informational content.

    Gee Murph, I think you'll find Marshall's been out of favour as long as von Däniken. What's next? Seeing any good cults to join? Now I actually had to study that crap (because I have expertise in other fields besides obsolete computing) - what's your excuse?

    This was 1964, Murph, and he really didn't get any better. Sure, he came up with the phrase "global village" and the meaningless "the medium is the message", and that's about it.

    But then he's probably your hero Murph, because he was always interested in talking about things rather than actually doing them.
    TonyMcS
    • You're 0 for 2 on the quotations

      And if you don't know his most often quoted stuff, why do you think you know enough to run down the other 99.999999999% of what he wrote?
      murph_z
  • Cost and Risk Avoidance

    Murph,
    Based on some recent posts, it seems that you are arguing that centralizing computing resources on a Unix-based infrastructure significantly reduces the cost and risk associated with the PC-centric architecture, thereby freeing IT to stop combating Windows Armageddon due to the next piece of malware and start working to better enable information flow throughout the organization.

    You make several assumptions:
    1. Centralized computing is lower cost/risk than any distributed computing model.
    2. IT resources liberated by switching to centralized computing will be reallocated to addressing business problems through technology.

    I don't buy #1 on technical grounds. Most of the savings will be due to reducing Windows, not centralizing computing. Furthermore, I believe that ultimately a distributed model with well-architected and managed infrastructure can be far more secure, reliable, cost-effective, and scalable than a centralized model. In fact, I'll contend that the belief that centralized processing is superior is based on the same superstition that causes users to want PCs - that whoever has physical control of the hardware ultimately governs its use.

    #2 doesn't hold water either, because the skillset required to keep Windows alive is different from the one required to solve business problems, so a substantial portion of IT will have to be let go regardless of whether there is a reduction in budget or not. Honestly, even if there weren't an immediate reduction in budget, I highly doubt sufficient quantities of competent people could be found to fill the new positions.
    Erik Engbrecht