Last week's talkback discussions on what Linux needs to do to gain significant market share produced a wide range of interesting ideas - including a few I hadn't thought about before. In response I plan to produce a synthesis - a longer article that pulls it all together and offers up some ideas for further discussion - but that will take some time.
Meanwhile, one of the side issues that came up was network computing - big servers, smart display desktops. I'm personally of the belief that this is a big part of the right answer for most business computing, but readers like Carl Rapson think otherwise.
In one discussion thread Anton Phildor made a comment about who should be in charge - IT or the user - saying "... some geek off in IT .... who restricts users has in fact taken away control."
That drew a response from "Jimbo:"
It's not so much that the geek in IT has control, as it is that
1. You have to call the help desk and listen to muzak for 15 minutes
2. You have to explain to the geek's front guy why you want to get a change done
3. You get to listen to muzak for another 15 minutes while he checks with the powers that be.
4. You get to submit a work order
5. You wait for someone to respond to your request
6. You have to call your manager and have him make the request because you aren't authorized.
7. Your manager forgets to follow up
8. You get the idea
To which I posted this response:
Yep - it's mainframe days all over again
I maintain that the MS client-server architecture has become what it started as a rebellion against - 327x (now PC) terminals for which the desktop user has responsibility but over which he/she has no control. The PC is prettier, but every bit as dumb.
And Carl Rapson added:
...that is more a function of the bureaucratic nature of large corporations than it is of the particular platform. The same (or similar) steps would have to be followed if the user had a dumb terminal on his desk. Plus, the user would have less personal control over his computing environment.
Don't let your bias against Microsoft trick you into thinking the problems with corporate thinking are caused by Microsoft.
Then, a bit later, he added:
If the mainframe approach was bad twenty years ago, why is it good now? After all, your "smart display" approach is basically the same old mainframe-dumb terminal approach, repackaged. So what's changed? How does it empower the user (the beginnings of this thread)?
And I have to snicker at your book title - The Unix Guide to Defenestration. Ostensibly about "throwing out what doesn't work", I think it's pretty obvious what you're advocating "throwing out" (considering what "defenestration" means).
With the exception of the fact that I don't actually dislike Microsoft or Windows - I just don't waste time struggling with things that don't work, and question the sanity of those who do - I agree with everybody here, myself included.
So how do I reconcile such obviously opposing views? By saying that Carl doesn't know what he's talking about - literally. I couldn't agree more with what he says about the mainframe architecture having been dumb twenty years ago, and, by extension, being even dumber now. Nor can I disagree with "Jimbo" or defend the looniness he reports - they're both absolutely right, and so is Anton. But, and here's the point: the Unix business architecture - big servers, smart displays on desktops - isn't the dumb terminal world come again. In fact, it's pretty much the exact opposite.
People often assume it is - they see servers, they see terminals, they hear "thin client," and they make assumptions about how these go together, but those assumptions are wrong.
So please, stay tuned: answering the question will take all week, starting with tomorrow's blog about how the mainframe architecture developed, why it's so bad, and how the modern locked-down corporate desktop replicates it.