Same old - Same old - and we're not learning from it

Technology changes every day - but management methods and ideas don't, so when we compare what we do today to what we did ten and twenty years ago, it's no surprise that only the technology seems to have changed.

One of the more interesting ways of viewing what we do in IT starts by looking at the long term - roughly the ninety years or so since data processing became important to leading-edge American businesses. In some ways the continuity is amazing: during the 1920s, for example, the high cost of tabulators drove a focus on system utilization as the pre-eminent measure of management success in data processing - and that same measure is the single most important factor driving today's infatuation with ghosting.

In effect, what's happened in data processing is that management rigidity has simply absorbed generations of technology change. If Forever Young's Captain Daniel McCormick had been a senior data processing manager, he wouldn't have recognised today's technology, but he'd have felt sufficiently at home with the COBIT framework this stuff runs under to go right to work in any big IBM shop.

The science based computing side has also largely followed one management model since the earliest 1930s and 40s experiments aimed at extending human abilities to communicate, calculate, and remember. That model, however, reflects the scientific heritage rather than a corporate financial control agenda and is based on individuals or small groups using what others have made available to advance the state of the art - thus open source in the 1930s meant that both Zuse and Atanasoff freely distributed their work.

By the late 1960s, however, these two groups were in collision as people trying to commercialize science based computing found markets for their applications on both super computers and mini-computers with the former shipping mainly to other academics and the military and the latter going mainly to departmental managers in commercial and government organizations.

One aspect of how this worked out in the late 70s and early 80s looked like a bit of a free-for-all, as data processing heads tried, mostly successfully, to get control of the mini-computer explosion by centralising processing in the glass house. And out of that we get, as one strand (and motivation) among many, users cynically accepting the lack of software for the IBM PC in order to leverage data processing's commitment to IBM - setting up thousands of the things as counterweights to the organisational arrogance inherent in data processing's management certainties.

Today data processing and the Windows culture that grew up in response to the complexities and limitations of the PC have essentially merged - with the worst-of-both-worlds compromises involved epitomized by HP's successful introduction of a user PC kept locked in a data center rack, with only the keyboard and screen on the user's desk.

What I think we're also seeing, however, is a bit of a re-run from the sixties and seventies prelude to the all out battles for control that took place in the late seventies and early eighties.

Some things have changed, of course: now IT management is aware there's life outside the glass house and almost any CIO can mouth business platitudes with the best of them, but the key drivers for user revolt are the same: differences in job focus, unhappiness about perceived waste, the siren call of more efficient, effective, and controllable solutions blocked by IT intransigence.

Although there's been tactical change, user strategies are broadly the same too: keep the cost below the radar, use available resources where possible, marginalize data processing's people and roles, defend actions on the basis of business need and flexibility, take every opportunity to weaken senior management's support for the IT guy.

It's on the tactical side that I see a key trend emerging: user managers are using corporate IT's commitment to Wintel to get control of local hardware no longer considered sufficiently capable to support data center objectives - and then using local resources to put open source applications, and thus usually Linux, on those machines.

In some ways what's going on resembles a replay of what happened when user management leveraged IT's commitments to IBM to bring in thousands of basically useless PCs, except this time it's IT's commitments to continual Windows upgrades that's being leveraged to free up boxes to run invisible, but valuable, Linux applications.

So if you've just slept for thirty years, here's the bottom line: welcome back. If you were senior enough in IT to have your hands off the technology then, you can go right to work now because, beyond vocabulary, very little has changed.
