IT Commandment: Thou shalt honor and empower thy (Unix) sysadmins

Summary: The sysadmin has to be empowered to act, within limits but immediately, unilaterally, and effectively, on user requests

In data processing your machine operators are nobodies - essentially semi-skilled labour invisible to anyone outside the glass room. In science based computing (i.e. Unix), however, your sysadmins are the people who work with the user community to make and implement the day to day tactical decisions characterizing successful systems operations.

Do data processing right and what you get is the intensely hierarchical mainframe (or Windows) data center with no user control, inflexible budgets, locked down desktops, and long duration change management processes that turn a request to add or change a minor function into a formal project.

Get science based processing right and what you get is centralised processing with decentralised management: combining the user control promised by the PC with the security and cost minimisation of Unix.

In the real world you don't see much sysadmin empowerment outside small businesses because big companies tend to put data processing people in charge of computing, and they see sysadmins as machine operators.

At the root of all this is a fundamental confusion: data processing is not computing.

In fact, about the only thing data processing has in common with computing is the use of computers. Unfortunately, many of the top executives ultimately responsible for computing infrastructure decisions in big organizations simply don't know this, and so rely on advice from data processing people to put data processing people in charge - and that ultimately produces the Unix data centers we generally see: metaphorical Boeing 787s towing railway cars, with users conditioned to expect, and therefore accept, defensive arrogance, missed deliveries, and inflexible routing.

Data processing started with mechanical tabulators in the nineteenth century, developed into a professional discipline during the 1920s, and became a cornerstone of business operations during the late 1920s and early 1930s. That was the period, too, in which fundamental organisational structures (like stringent role separation), fundamental controls (like the service level agreement), and fundamental functional assumptions (like the emphasis on reporting, the focus on processing efficiency, the reliance on expectations management, and the tendency to justify costs on layoffs) all became more or less cast in stone.

When computers entered this picture in the late 1940s they weren't used to replace tabulators, they were used to control tabulators - and cost justified on layoffs among the people who had previously controlled batch processes. Thus the first assemblers were physically just that: controls enabling the automated assembly of jobs from card decks and the transfer of information from the output of one batch run to the inputs of the next one.

Out of that comes the modern data processing center: with JCL replicating much of that original control structure, COBOL directly implementing many mechanical card operations, rigid role separation now enforced in the software, and thousands of supposed applications replicating the original card deck processing and assembly routines but cumulatively forming only a small number of real applications.

Science based computing had nothing to do with any of this and focused, from its origins in the mid to late thirties, on problem solving and the extension, rather than the replacement, of human ability. Thus when Atanasoff and Zuse dreamt of solving computational problems, Shannon applied computing to communications, or Newman used a Colossus in raid planning, none of them had the slightest interest in financial reporting or other commercial tasks.

That science community focus continues with Unix today - particularly in the BSD and OpenSolaris communities - but it's been there from the beginning. Thus when Thompson and his colleagues first worked on Unix, they talked about forming communities:

From the point of view of the group that was to be most involved in the beginnings of Unix (K. Thompson, Ritchie, M. D. McIlroy, J. F. Ossanna), the decline and fall of Multics had a directly felt effect. We were among the last Bell Laboratories holdouts actually working on Multics, so we still felt some sort of stake in its success.

More important, the convenient interactive computing service that Multics had promised to the entire community was in fact available to our limited group, at first under the CTSS system used to develop Multics, and later under Multics itself. Even though Multics could not then support many users, it could support us, albeit at exorbitant cost. We didn't want to lose the pleasant niche we occupied, because no similar ones were available; even the time-sharing service that would later be offered under GE's operating system did not exist. What we wanted to preserve was not just a good environment in which to do programming, but a system around which a fellowship could form. We knew from experience that the essence of communal computing, as supplied by remote-access, time-shared machines, is not just to type programs into a terminal instead of a keypunch, but to encourage close communication.

(Emphasis added)

Today that's exactly what a good Unix sysadmin does: facilitate interaction among a community of users by continually adjusting services to meet their needs. As a result that sysadmin has to be empowered to act, within limits but immediately, unilaterally, and effectively, on user requests.

In contrast most machine operators have been replaced by automation, but the remaining few have no decision role, no access to users, and no control even over their own jobs.

Thus a CIO's job in a defenestrated data center is fundamentally to make sure that the sysadmins have the access, the skills, and the resources they need to respond to users - and that's the opposite of what's needed in data processing where IT management focuses on husbanding a scarce resource and deploys "client facing" people and processes mainly to buffer change in resource demand.

Confuse one with the other, try to apply lessons learnt in a hundred years of data processing to computing, and what you get is what we mostly see in larger data centers: Dilbert's world of tightly locked down desktops, long change processes, powerless systems people interacting with equally powerless users, and ever escalating costs. To fix that: adopt Unix; put science people in charge; turn IT inside out to push services and resources at users instead of focusing inward on resource stewardship; and empower the sysadmins to work directly with users, figure out what the job is, and get it done.


Our IT Commandments:
  1. Thou shalt not outsource mission critical functions
  2. Thou shalt not pretend
  3. Thou shalt honor and empower thy (Unix) sysadmins
  4. Thou shalt leave the ideology to someone else
  5. Thou shalt not condemn departments doing their own IT
  6. Thou shalt put thy users first, above all else
  7. Thou shalt give something back to the community
  8. Thou shalt not use nonsecure protocols on thy network
  9. Thou shalt free thy content
  10. Thou shalt not ignore security risks when choosing platforms
  11. Thou shalt not fear change
  12. Thou shalt document all thy works
  13. Thou shalt loosely couple


Talkback

  • SOx handcuffs

    The new SOX regulations have further eroded the traditional system admin's role. Instead of being the one-stop-shop keeper of root, the system admin is being pigeonholed into "roles" - where root access is locked up, and sysadmins get some sort of "pseudo root" access. Those "data processors" that run the show are becoming the equivalent of Nazis (or Bushies - your choice), where everything is monitored, every move is scrutinized, and all access is doled out on a need-to-know basis.

    Security IS important! And most (if not all) sysadmins understand this implicitly. <b>I have yet to find an untrustworthy (L-unix) system administrator!</b> I suppose that they COULD exist; it's just that after almost 20 years, I would be hard-pressed to point one out. Every sysadmin I have ever met is dedicated to keeping the systems running smoothly and securely (even the lazy ones).

    I had a chance to re-join an operations team and become a full-time sysadmin again. I thought long and hard about it, but with today's restrictions - and lack of support from management, I decided that paper-pushing would be better for my health. Sad, VERY sad . . .
    Roger Ramjet
    • Agreed

      On both issues:

      - yes, most sysadmins are good people; and,
      - yes, SOX has proven to be useful in data processing's struggles to treat Unix as if it were MVS and use it to bring back the 70s.
      murph_z
      • Here's how I see it

        Windoze had a lot of security issues, so company "F" came down hard on . . . UNIX.
        Roger Ramjet
    • Sox

      As a SOX IT auditor, I'm sorry but you have to just suck it up. I'm tired of hearing IT folks whine that all their access is being taken away and they can't do what they've always done...which is whatever they feel like. They may be trustworthy, but that doesn't mean they always do the right thing. I've seen way too many mistakes ($$$) by some well intentioned admin.

      I always find it interesting how the IT groups always complain that nobody else understands IT and what they have to put up with from the rest of the business. And yet they can never turn that around and see that IT usually doesn't know much about the rest of the business...and how their mistakes can have significant impact on the accounting, billing, and finance functions.
      jshaw4343
      • No one's arguing with you on this

        This column is about how to do IT RIGHT, you're talking about people doing it wrong - an overwhelming majority.

        You should be aware, for example, that nominal SOX best practices derive ultimately from the COBIT work, which in turn reflects data processing, not computing.
        murph_z
  • Alternative solution.

    As I understand it, your plan is to make the sysadmin into an investigator who walks out into the rest of the business, inquires for what's wanted, walks back into the data center, and figures out how to make things happen.

    You do realize how locked down and IT-centric that approach is, right?

    Here are the quotes.

    First:
    Thus a CIO's job in a defenestrated data center is fundamentally to make sure that the sysadmins have the access, the skills, and the resources they need to respond to users - and that's the opposite of what's needed in data processing where IT management focuses on husbanding a scarce resource and deploys "client facing" people and processes mainly to buffer change in resource demand.
    EoQ

    So the sysadmins will respond to users instead of setting up procedures to push them away.

    Second:
    To fix that: adopt Unix; put science people in charge; turn IT inside out to push services and resources at users instead of focusing inward on resource stewardship; and empower the sysadmins to work directly with users, figure out what the job is, and get it done.
    EoQ

    The users yearn, the sysadmin solves.


    Here's an alternative: give the users the resources to solve their problems themselves, as quickly as they can and as responsively to their needs as they want.

    Recently Mr. Ballmer gave a speech saying exactly that to CEOs.

    According to Mr. Ballmer, IT sections had urgently asked him to give such a speech, because IBM continuously campaigns to CEOs for the use of consultants and services outsourcing.

    The IT sections needed the "air cover" to keep control over IT by showing that it could empower people. They'd handle the technological aspects of purchasing and using Microsoft products, but needed help being able to retain authority for purchasing decisions in the first place.

    As all good IT sections say, Thank heaven for Microsoft, the way to defeat what you're calling the "data processing" model.


    Unix? That's an old mainframe language, sort of like COBOL, isn't it?!
    Anton Philidor
    • Dead wrong

      Try doing something of your own on a PC in any large business or government, and you'll discover that control has been centralized - even if some processing hasn't yet been.

      Windows is becoming the 70s mainframe.
      murph_z
      • That's not by user request.

        Yes, some IT sections do take as much control as possible away from users. Some IT sections are already being newly led by non-IT staff, and these two facts may be connected.

        You saw the article about CIOs searching for people with a business as well as an IT background. That's to make certain they understand what's being told them when they go walkabout.

        The CIOs have even hired people from other fields to lead IT projects, maybe because IT staff might be too IT-centric.

        You've identified a significant problem in excessively centralized (in IT) control.
        But is that a problem that's solved organizationally or by going from user-directed Microsoft products to IT-directed Unix products?

        Because of all Unix's virtues, I want to see Unix-based products as user friendly as anything Microsoft and other PC vendors produce. (And given the terrible elaboration of Photoshop, for example, that should be easy.)

        I think that we have to agree that's not happening as much as it should be.
        Anton Philidor
        • Those who can, do

          [You saw the article about CIOs searching for people with a business as well as an IT background. That's to make certain they understand what's being told them when they go walkabout.]

          That is what us old-timers call MIS - a VERY separate discipline from CIS. MIS students take a bunch of business classes and a handful of low-level CIS classes (no compiler-design here!). To say that these people "understand what's being told them when they go walkabout" is akin to saying "George Bush knows what he's doing in Iraq because he's commander-in-chief".

          [Because of all Unix's virtues, I want to see Unix-based products as user friendly as anything Microsoft and other pc vendors produce.]

          To me, UNIX <b>IS</b> "user friendly". I can do ANYTHING I want - quickly and easily, by typing in a few commands. Yes, a CLI isn't for everyone - but it is the epitome of "easy". I learned UNIX the old-fashioned way - there's your machine, and oh yeah, use the "man" command. The learning curve was steep, but I retained it all and now it's a no-brainer to write scripts quickly to do things I want to do. What could be easier? Use man to find the parameters to the command - or fumble around with different pull-down menus to find what you *may* want - and may never find? "Easy" is in the eye of the beholder.
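          A minimal sketch of the kind of throwaway script I mean - the path argument is made up for illustration, and every command in it is covered by its man page (du(1), sort(1), head(1)):

```shell
#!/bin/sh
# Quick one-off: list the five biggest files/directories under a path.
# Defaults to the current directory if no argument is given.
dir="${1:-.}"
# du -a sizes everything, sort -rn orders largest first, head trims to five
du -a "$dir" 2>/dev/null | sort -rn | head -5
```

          Pipe three commands together, done - the sort of thing that takes minutes at a shell and a tour of pull-down menus anywhere else.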
          Roger Ramjet
          • Agreed

            On ease of use and scripting.

            re: the intermediaries: this is like putting business people in as CIOs, it just means that somebody else is doing the real job.
            murph_z
          • I worked with an MIS guy once...

            I almost killed him because no matter how hard I tried I couldn't get him to understand recursion.
            Erik1234
          • Hey, I know him..

            or, at least, a few hundred just like him.

            On the other hand, you want a real challenge? Try teaching a 20 year COBOL vet to program in APL!
            murph_z
          • This particular COBOL veteran

            Is more interested in modern languages like Tutorial D or TLA+ than in archaic relics like APL. Being responsive to user requirements is pointless if you cannot guarantee the logical consistency of what you are doing.

            By the way Paul do you actually understand what Edsger Dijkstra's objections to COBOL were or are you just repeating widely held prejudices?

            Dijkstra on COBOL: "The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence"

            Dijkstra on APL: "APL is a mistake, carried through to perfection. It is the language of the future for the programming techniques of the past: it creates a new generation of coding bums"
            jorwell
          • Recursion, or...

            ...why I can do in 5 lines of code what you can't quite do in 50...
            Erik1234
          • Why is terseness important?

            If recursion is the logically correct way to implement something then it should be used, not because you need to write less code.

            However, many problems do not require iteration or recursion, yet many programmers still seem to be strangely enthusiastic about procedural methods rather than declarative approaches.

            On the whole I see no need for iteration, recursion may occasionally be useful.

            Still, let's do away with the dirty, unstructured things we have in C and COBOL (pointers and goto).
            jorwell
      • But that's...

        ...a function of management, not the OS everyone is running. The OS has ABSOLUTELY nothing to do with it. I guarantee that any Unix-based network could be managed in exactly the same manner. In fact, the "Sun Ray" smart display setup you've been touting would be managed centrally also, wouldn't it? How is that different?

        Carl Rapson
        rapson
        • The difference is in possibility

          People using wintel cannot do distributed management - not for cost reasons, not for security reasons, and not for simple "getting it to work" reasons. Control centralization is imposed by the technology, do anything else and your costs go up and up ...

          With Unix and smart displays there are no such limits. You can manage centrally (replicating the old mainframe approach) or you can decentralize management - the technology supports either approach and doesn't provide cost incentives to centralization.

          Basically, with Unix you can, with Wintel you cannot.
          murph_z
          • Not the same thing

            Your article isn't talking about distributed management, it's talking about 'locking down' peoples' desktops. Those are not the same thing. There is nothing inherent in Windows that requires the draconian centralized management that you describe.

            If you want to write an article about the benefits of Unix vs. Windows in distributed management, I'll probably agree with it. But what you described in this article relates to management, not system administration.

            Carl Rapson
            rapson
          • Sure there is

            It's called the client server model and you can't run Wintel without it - and you can't decentralize it either without creating an uncontrollable cost sink - or did you think people were using Citrix and locking desktops for fun and productivity?
            murph_z
          • Excuse me?

            [i]"People using wintel cannot do distributed management - not for cost reasons, not for security reasons, and not for simple "getting it to work" reasons. Control centralization is imposed by the technology, do anything else and your costs go up and up ..."[/i]

            You're kidding, right? Oh wait, you're Paul Murphy. I guess not.

            Distributed management is one of the best strengths of Active Directory. Your not realizing it, or not understanding how it works, doesn't make it not so.

            [i]"Basically, with Unix you can, with Wintel you cannot."[/i]

            Replace "you" with "I", look in the mirror and repeat it a few hundred times, and hopefully you'll get the message.
            toadlife