
Kernel hacking, politics, and computer hardware

Written by John Carroll, Contributor

A recent interview with Con Kolivas, former Linux kernel hacker, makes for an interesting read. Mr. Kolivas became well known for his performance-related patches, and, oddly enough, he did all of it as a HOBBY, with no prior background in computer programming: his day job is as an anaesthetist in Melbourne, and he is entirely self-taught as a programmer. This self-taught programmer with an Economics degree from a liberal arts university can appreciate that.

Mr. Kolivas is proof of the power of the community software development model, and his example is one reason Microsoft would do itself a lot of good by releasing more source code. Not all of it, mind you, as proprietary software is a good revenue model, but finding a way to harness this energy, or to use wholly community-developed projects within Microsoft, can only benefit the company.

Diversion aside, the article is interesting for its insight into the politics of community development. The Linux kernel mailing list is the standard way to communicate with kernel developers. Apparently, you need to be a brave soul, one immune to the slings and arrows of critics, to post to that list. "Noobs" are singled out for special humiliation, so it's important to know what you are doing before you post.

Politics among kernel hackers is an issue, and it played a big part in his decision to stop kernel hacking (not that I would know anything about such things, as Microsoft is completely politics-free...I am kidding). He feels that desktop Linux issues are given short shrift compared to the needs of server-side Linux, the area where Linux is currently most popular. Dealing with those battles took a personal toll, so he chose to spend more time on other pursuits. That's what you can do when it's a hobby, and he retains the option to change his mind at the drop of a hat.

One area of curiosity, at least for me, was his dislike of the hardware standardization that crept into the industry, largely due to the growing dominance of Microsoft Windows. Quoting Kolivas:

In the late 1980s it was a golden era for computing. There were so many different manufacturers entering the PC market that each offered new and exciting hardware designs, unique operating system features and enormous choice and competition. Sure, they all shared lots of software and hardware design ideas but for the most part they were developing in competition with each other. It is almost frightening to recall that at that time in Australia the leading personal computer in numbers owned and purchased was the Amiga for a period.

...

Hardware has since become subservient to the operating system. It started around 1994 and is just as true today 13 years later. Worse yet, all the hardware manufacturers slowly bought each other out, further shrinking the hardware choices. So now the hardware manufacturers just make faster and bigger versions of everything that has been done before. We're still plugging in faster CPUs, more RAM, bigger hard drives, faster graphics cards and sound cards just to service the operating system. Hardware driven innovation cannot be afforded by the market any more. There is no money in it. There will be no market for it. Computers are boring.

In other words, hardware standardization was a BAD thing because it limited innovation in the hardware space. I wonder, however, whether he would make the same argument vis-à-vis standard protocols, such as HTTP or TCP/IP, or with respect to file formats, such as PNG or ODF?

Granted, computer manufacturers aren't popping up with new hardware architectures on a regular basis, at least not ones that target the desktop. You still see newer architectures in the embedded space, though some degree of standardization has crept in there as well.

The reason standardization occurs, however, is that it dramatically lowers costs. Economies of scale apply here: you can build a machine for a lot less per unit if you build a billion of them than if you build a million. Those costs keep falling, because all the engineering energy poured into a single hardware architecture yields cumulative efficiency gains.
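To put rough numbers on that intuition, here is a minimal sketch of fixed-cost amortization; the figures (a one-time design cost and a per-unit build cost) are invented for illustration and are not from the article.

```python
# Toy model of economies of scale: a one-time design/tooling cost is
# spread across every unit produced, so average cost falls with volume.
# The figures below are hypothetical, chosen only to illustrate the point.

def unit_cost(fixed_cost: float, marginal_cost: float, units: int) -> float:
    """Average cost per machine at a given production volume."""
    return fixed_cost / units + marginal_cost

FIXED = 2_000_000_000    # assumed one-time cost of designing the architecture
MARGINAL = 250           # assumed cost to build each additional machine

for volume in (1_000_000, 1_000_000_000):
    print(f"{volume:>13,} units -> ${unit_cost(FIXED, MARGINAL, volume):,.2f} per machine")

# 1,000,000 units     -> $2,250.00 per machine
# 1,000,000,000 units -> $252.00 per machine
```

The point of the toy numbers is simply that the design cost per machine shrinks toward nothing as volume grows, which is what a standardized architecture buys you.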

Practically every mainstream desktop operating system, Linux and the Mac included, runs on this standard PC architecture. That has enabled prices to drop precipitously over time, allowing computers to cost as little as $300 (impossible in the 1980s) or, in the case of the Mac, to generate even higher profit margins following the shift to an Intel architecture (come on, Mac fans, you know your hardware is more expensive).

Yes, hardware has become subservient to software, but that brings benefits that appeal to most consumers, democratizing computing and ensuring that an ever-growing number of users can own one. Innovation still exists; you just get less wild divergence, except in the embedded space, and even there, standardization exists.

That seems an acceptable trade-off. Costs do matter. Ferraris might have some interesting engine architectures, but I'm more likely to buy a Honda Civic.
