Open source developers provide 'glimmer of hope'

Summary: A software design expert claims that software quality is declining everywhere except in the open source world, but others insist that proprietary software is just as good as the alternative.

An eminent software developer has claimed that the pressure to be first to market with new technology is leading to a decline in software quality, but that standards are higher in the open source world.

James Coplien, a software design expert who currently works as an object architect at US-based software company DAFCA, said in an interview at the ACCU conference in Oxford that unless consumers start demanding better-quality software, the software industry is unlikely to change.

"There's a pressure that unless you're one of the first three players in the market you don't have a chance," said Coplien. "Quality is suffering for time — people pay money for the first, not the best. It comes down to the fact that consumers are willing to put up with crap systems that crash all the time."

Coplien said the only area of the industry where people still take pride in the quality of the software they deliver is the open source community.

"The one glimmer of hope is the people who've said, 'Screw the industry, we're going to write excellent software and give it away', in other words, the open source movement," said Coplien. "I take off my hat to these people. Linux is one of the highest quality pieces of software out there."

There are various reasons why open source software is of better quality than proprietary software, according to Coplien. He claimed the collaborative effort of open source contributors, combined with a core group of developers, is the best way to build a secure IT system.

"Security is a system concern — it is a complex system," said Coplien. "How does nature deal with complex systems? Each cell does its own thing. The complementary, independent, selfless acts of thousands of individuals [in the open source community] can address system problems — there are thousands of people making the system stronger. If it was uncoordinated it wouldn't work, but there is a core of developers at the centre."

But other industry experts at the ACCU conference disagreed that open source code is superior to closed source code. Bjarne Stroustrup, the creator of C++ and currently a professor at Texas A&M University, said that the quality of open source software is not necessarily any better.

"Open source is a good idea, but not all open source code is good," said Stroustrup. "Some of the best code in the world is not open source."

"For example, I would dearly love to have a good look at the [proprietary] code running in the Mars Rover. It has to be good — it's been running on Mars for 15 months and has to be debuggable remotely."

Coplien argued that open source software is better tested than closed source software because there are "more eyes" looking at it, and people are encouraged to find bugs. "If I can find a bug in Linux, it’s a lifetime accomplishment," said Coplien. "In the Linux community it is a badge of honour to find a bug," he said, adding that open source developers are under pressure to write superior code because they know it will be seen by many other coders.
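
As a back-of-the-envelope illustration of the "more eyes" argument (an editorial gloss rather than a figure from Coplien, and one that assumes the reviews are genuinely independent): if each reviewer has a probability p of spotting a given defect, the chance that the defect survives N independent reviewers is (1 - p)^N. Even with p as low as 0.05, only about 0.6 percent of such defects would slip past 100 reviewers.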

But the security of open source software is a controversial issue. Linux kernel co-maintainer Andrew Morton said this week that a lack of 'credit or money or anything' for those who test the open source OS could threaten its long-term stability.

And speaking at the ACCU conference, Ross Anderson, professor of security engineering at Cambridge University, said that open source software is not inherently more secure than closed source software: although users can find and fix vulnerabilities more easily when the code is available, the same openness also helps those attacking the software.

But if an asymmetry is introduced that gives attackers or defenders an additional advantage, it will affect the relative security of open and closed source software, according to Anderson. Factors that could reduce the relative security of closed source software include commercial pressures, where a company does not fix a bug because of the cost, and PR pressures, where a company tries to hide information about a bug to avoid negative publicity, he said.
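
In outline, the symmetry argument can be sketched with a simple reliability-growth model (a simplified illustration of the reasoning, not the exact formulation in Anderson's paper): suppose the mean time between security failures grows roughly in proportion to the cumulative testing effort, MTBF(t) ≈ t/k. Opening the code speeds up bug discovery by some factor λ for defenders and attackers alike, so MTBF_open(t) ≈ MTBF_closed(λt) holds on both sides of the contest, and the attacker-defender balance is unchanged until an asymmetry of the kind Anderson describes tips it one way.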

Anderson's research on this issue is available as a PDF file from the Cambridge University Web site.

Talkback

6 comments
  • I was at a presentation at OSCon last summer, where team members of the Mars Rover program demonstrated their software. It's mostly built upon Linux and other open source platforms. So the quote from the person who disagreed actually supports the assertion about the quality of open source software.
    anonymous
  • I think that Coplien's assumptions about software quality are right on target.

    What springs immediately to mind are the two software platforms that I use: Windows and Linux. Odd that the Windows desktop starts to crawl after a mere few days of running 7x24, while the Linux desktop chugs away without any degradation in speed for months at a time.

    Now, before you raise the issue of any sort of Windows spyware or malware infection: the machine is clean. It's been checked by MS AntiSpyware, Ad-Aware, Spybot and F-Secure. So that's not it.

    Before you go and raise the issue of performance optimizations such as defragging the hard drive, registry or paging file: these are all already done. The registry files are each in a single fragment, and the same goes for the pagefile.

    So what is the deal? I figure there must be a resource leak in the OS, or in some of the DLLs the applications are using, that is causing the slowdown. This is in fact supported by Task Manager. After a few days, the virtual memory consumption is over 1 GB, while when freshly booted the OS only uses around 400 MB.

    Other indications of software quality? Well, if the Windows box handles a lot of heavy network traffic for a long time, slowly but surely the network connection degrades until it becomes unusable. At that point you have to reboot the machine. Similar situation in Linux? Nope. Linux runs under heavy network load for months at a time without a hitch or slowdown.

    Conclusion: Microsoft has some software quality problems. Like you didn't know that?

    Right, and instead of fixing the problems, they come out with more overly complex code to do even more fancy needless stuff, like all the glitz and glamor of the new Longhorn interface. Just another source of continued resource drain.
    anonymous
  • Yeah yeah yeah... recently MySQL and PHP had the worst bugs, and automated worms were defacing websites using them!

    If you think "open source" means "quality" or "security", then you don
    anonymous
  • Very simple but very true: pressure to deliver drives quality down, because users will usually stick with the first software they get to use. I see that all the time in my industry (financial risk software). Still, the same problem will occur if more open source developers end up being commercially employed and motivated.

    Open source suffers from being open source. You get to use plenty of software that is not ready, simply because the pull of new functionality drives a lot of people to upgrade to pre-release quality stuff. But when it's ready, it's ready.
    anonymous
  • Discussing "quality" without defining what it is is quite a useless exercise. Just counting bugs per line is one thing; open source is a clear winner there. If code usability is considered, proprietary code comes out a lot better. Then again, "usability" depends on the user, and what is indispensable for some users may be useless for others.

    I run a dual-boot PC at home with Slackware Linux as my primary OS, but I must say it is a real pain to get some things, like printing, the scanner and the camera, working. On Windows 2000 it works first time, every time. I prefer Linux because I am in charge of what it does and what it doesn't, and it is more stable in the performance sense, while W2K tends to slow down with time. I have to say W2K has never crashed on me so far, while Linux I have to reboot once in a while because of a runaway printing process hung on USB. I also have a much greater variety and choice of software on Linux, though most of the time the "choice" means I have to pick one decent program out of 200 primitive exercises in programming, investing my time in research, trial and error as opposed to investing my money in just buying it. I like investing my time more than my money, but that's my choice, and I can see that most people will prefer to invest money, not time.
    anonymous
  • The Mars Rovers use VxWorks. Go buy a source licence. It isn't rocket science.
    anonymous