Eric Raymond: Proprietary software is doomed

In part two of his ZDNet UK interview, the co-founder of the Open Source Initiative explains why the days of proprietary, centralised software development are numbered
Written by Matthew Broersma, Contributor
Q: Red Hat's Bob Young argues that Linux will never take over the desktop, but that it will make the desktop largely irrelevant by controlling the Internet back-end. What are your views on the desktop debate?
A: I think Linux will take over the desktop, and I think the reason it will doesn't have much to do with whether we clean up and polish our interfaces or not. Linux will take over the desktop because as the price of desktop machines drops, the Microsoft tax represents a larger and larger piece of OEM margin. There's going to come a point at which that's not sustainable, at which OEMs have to bail out of the Microsoft camp in order to continue making any money at all. At that point, Linux wins even if the UI sucks. And frankly, the UI doesn't suck. It's not perfect, it's got a few sharp edges and a few spikes on it, but so does Windows.

We broke through the $1,000 (£700) floor some years back, but my threshold figure for when Microsoft isn't viable any more is when the average desktop configuration drops below $350. I got that figure by looking at Microsoft's position in the market for PDAs and handhelds. Above $350, Windows CE has some presence, largely because Microsoft is heavily subsidising it; below $350, Microsoft is nowhere. The reason is very clear: if your unit price is that low, you can't pay the Microsoft tax and make any money. We're heading toward the point where consumer desktops are available at that price. Some of the low-end PC integrators, outfits like eMachines, are already there.
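To make the arithmetic behind that threshold concrete, here is a rough sketch. The licence fee and the OEM margin rate below are illustrative assumptions, not figures from the interview.

```python
# Toy calculation: how much of an OEM's gross margin a fixed per-unit
# Windows licence consumes as PC prices fall. Both constants are assumptions.
LICENSE_FEE = 50.0   # assumed per-unit Windows licence cost, in dollars
MARGIN_RATE = 0.10   # assumed OEM gross margin on the hardware

for price in (1000, 700, 500, 350):
    margin = price * MARGIN_RATE
    share = LICENSE_FEE / margin
    print(f"${price} PC: ~${margin:.0f} margin, licence takes {share:.0%} of it")
```

Under these assumed numbers the licence eats half the margin on a $1,000 machine and more than all of it at $350, which is the squeeze Raymond is describing.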
Q: Microsoft has tried to co-opt interest in open source with its "shared source" initiative. Is that going to work?

A: I don't see any signs that it's changing anybody's mind. I don't see anybody in the press saying, "That's wonderful! In fact, it's so wonderful that we'll swallow XP's licence restrictions, we'll swallow .Net and Passport..." It isn't happening.
Q: Is it just a PR move?

A: It goes deeper than that. Everything in Microsoft's strategic behaviour for the last two years, as far as I'm concerned, can only be accounted for by the hypothesis that they know their packaged-software business is doomed. They're moving from a product business, where selling Windows CDs is their major revenue stream, to what they're telling everybody they want to be: the world's biggest ASP. Now, people haven't really thought about this, but being an ASP is harder than being in a product business. The staff requirements are more demanding and the margins are lower. Why would Microsoft go from an easy business to a hard business? I think the right answer is that they know the easy business is doomed. Bill Gates said as much in his famous 1995 email saying the Internet was the future.
They have a strategic problem: somehow they have to make the transition to a Passport and .Net business model before Wall Street figures out that their current business model is screwed. If the investors figure that out before Microsoft has changed horses, they're going to discount the future value of the stock, and the whole financial pyramid that Microsoft is built on will just collapse. I wouldn't be sleeping too well if I were a Microsoft strategist right now, because that's a really hard job, especially considering that they don't even have the technology in place for the new business model yet. And even if they had the technology in place, they would have a very hard job persuading corporate managers to buy in, simply because of the control issue. If I have all my business processes farmed out to an ASP, I don't control them any more. It's not just a matter of being dependent on somebody else's downtime as well as my own; how do I know that my core business secrets are still protected?
Q: Speaking of security, the Internet Engineering Task Force (IETF) recently released a draft protocol for reporting security flaws in software, which some critics said was too slanted in favour of the software industry.

A: That was very good, very well done. I skimmed it and I didn't feel that way. I remember reading it and thinking that they had chosen the time-outs for reporting requirements just about right; they chose just about the same time-outs I would.
Q: Is there a danger of software companies exercising too much control over how and when software bugs are reported?

A: There's the obvious threat from the DMCA, if that kind of control is written into the licence, but under current software licences they can't control that kind of disclosure. In fact, if they tried, they'd probably run into serious legal problems, so I don't see it as a major issue. I'm not worried, for two reasons. One is that there are very articulate and capable people with press exposure and credibility in the security community who are prepared to go out there and say that full disclosure is the only way you can get decent security. I'm thinking, for example, of Bruce Schneier at Counterpane Internet Security. He's done an excellent job of educating the trade press on this, and there are other people almost as capable as he is in that way, so I think they'll keep that issue alive. The other reason I'm happy about that RFC (Request for Comments) you just mentioned is that anyone who comes under corporate pressure not to report bugs can point at it and say: hey, this is Internet best practice here, so get off my back.
Q: Would the IETF proposal make any difference?

A: In that political sense, yes. I don't think the draft RFC does anything more than slightly formalise the unwritten guidelines that already exist, as witnessed by the fact that they chose the same time-outs I would have (laughs). But managers have a superstitious respect for documentation and procedures, so being able to point at a document does help.
Q: How is the open-source movement different today from, say, 1999?

A: I think we're more sober now than we used to be. There was a period during the dot-com boom in '99 when a lot of people were in some danger of being distracted by the prospect of lots of easy money. That prospect has gone away now, which is all right if it has the effect of re-concentrating us on the work.

I think we also have a lot more credibility with the global 1,000 and the business press than we had in '99. We've got more success stories under our belt, and more people who've considered the pro-open-source argument carefully and decided they agree with it. Witness what happened last year, when there was some danger that Microsoft was going to mount a full-bore propaganda campaign against us. If they had done that in mid-1998, just after the Mozilla source release, they might have buried us. I was seriously worried that that was a possibility: that they would turn on the hype machine before we had enough success stories and enough corporate backing to counter it. What happened in early 2001 demonstrated that we had already achieved enough mainstream cred and recruited enough backers inside the establishment, as it were, that when Microsoft tried it, it just bounced. That's a significant difference from '99.
Q: Mainstream credibility is important to you and the OSI, isn't it?

A: The thing that I've always kept in mind, and the reason I founded the OSI in the first place, is this: if you want to change the world, you have to co-opt the people who write the cheques. Maybe it sounds pretentious to say so, but most of the people who do this care mostly about art, not about money. If that weren't the case, they'd be off doing something else. Mind you, I'm not saying it's necessarily better to care about art than about money; I'm just making an observation about the motivations of the people who do this.
Q: What's the future for the "bazaar" open-source model?

A: I see that continuing to succeed, in a way that's separate from the debate about business models. The reason I'm very sure of that is the scaling problem software development is having as machines grow more capable and software grows more complex. The fundamental problem is that machines roughly double in capability every eighteen months, and the size of the average software project in lines of code tends to double on much the same schedule. That's a real problem, because bugs generally arise from unanticipated interactions between different pieces of code in a project, which means the number of bugs tends to rise with the square of the number of lines of code. So as projects get larger and their bug density increases, the verification problem gets worse, and it doesn't get worse linearly, it gets worse quadratically.

The reason I'm confident that the bazaar model, the open-source model, will continue to thrive and claim new territory is that all of the other verification models have run out of steam. It's not that open sourcing is perfect, or that the many-eyeballs effect is in some theoretical sense necessarily the best possible way to do things; the problem is that we don't know anything that works as well. And the problems with other methods of QA (quality assurance) are actually increasing in severity as the size of projects goes up. Open-source development, open-source verification, the many-eyeballs effect, on the other hand, seems to scale pretty well; in fact it works better as your development community gets larger. If you want a really fundamental analysis, what we're perpetually rediscovering at every scale of complexity is that centralisation doesn't work. Centralisation doesn't scale, and when you push any human endeavour past a certain threshold of complexity, you rediscover that.
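A toy model makes the quadratic claim concrete. The interaction rate below is an arbitrary assumed constant; the point is only that doubling project size roughly quadruples the expected bug count.

```python
# Sketch of the scaling argument: if bugs come from unanticipated pairwise
# interactions between pieces of code, expected bugs grow with n choose 2,
# i.e. roughly with the square of project size n.
INTERACTION_BUG_RATE = 1e-9  # assumed bugs per pair of interacting lines

def expected_bugs(lines_of_code: int) -> float:
    pairs = lines_of_code * (lines_of_code - 1) / 2
    return INTERACTION_BUG_RATE * pairs

for loc in (100_000, 200_000, 400_000, 800_000):
    print(f"{loc:>7} LOC -> ~{expected_bugs(loc):6.0f} expected bugs")
```

Each doubling of lines of code roughly quadruples the expected bugs under this model, which is why a verification method that gets stronger as the community grows matters more as projects grow.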
Q: That recalls the argument of a few weeks ago about whether Linus Torvalds should get an assistant.

A: That's another illustration of the problem. Centralisation doesn't scale, even when the centre is Linus.