The ugly episode of Heartbleed has put OpenSSL under more scrutiny than any open source project has ever faced. At a certain level of scrutiny perhaps any program will look bad, but OpenSSL is on the hot seat because it's OpenSSL that failed in its mission. It's hard to construe these matters in a way that makes OpenSSL, or its open source nature, look good.
But who is this "OpenSSL"? When something goes wrong with a product people want to know who is responsible. Many will be shocked to learn that it's all run by a small group of developers, most volunteers and all but one part-time. Huge parts of the Internet, multi-zillion dollar businesses, implicitly trust the work these people do. Why?
Let's stipulate that OpenSSL has a good reputation, perhaps even that it deserves that reputation (although this is not the first highly-critical vulnerability in OpenSSL). I would argue that the reputation is based largely on wishful thinking and open source mythology.
Before the word "mythology" gets me into too much trouble, I ought to say, as Nixon might have put it, "we're all open source activists now." For some purposes, open source is a good thing, or a necessary thing, or both. I agree, at least in part, with those who say that cryptography code needs to be open source, because it requires a high level of trust.
Ultimately, the logic of that last statement presumes that there are people analyzing the open source code of OpenSSL in order to confirm that it is deserving of trust. This is the "many eyeballs" effect described in The Cathedral and the Bazaar, by Eric Raymond, one of the early gospels in the theology of open source. The idea is that if enough people have access to source code then someone will notice the bugs.
This is, in fact, what has happened with Heartbleed... sort of. Heartbleed was discovered by Neel Mehta, a security researcher at Google. If you look at the vulnerability disclosures coming out of other companies, Apple and Microsoft for example, you can see that Google spends a lot of time scrutinizing other people's programs. They're like no other group in this regard.
But it took Google two years to find it. In the meantime, Google finds lots of security problems in Apple and Microsoft products for which they have no source code. This is because in the time since the formation of the "many eyeballs" hypothesis, there have been huge improvements in testing and debugging tools. Some computer time with a marginal cost of $0 is worth thousands of very expensive eyeballs.
I'd go so far as to suspect that the availability of source makes developers and users discount the need for the kind of testing that is standard for commercial software. I wouldn't be surprised if a static source code analyzer would have found the Heartbleed bug, flagging it for possible buffer over/underrun issues. Heartbleed might also have been found by a good round of fuzzing.
As I said recently, some programs are so critical to society at large that someone needs to step in and make sure they are properly secured. Obviously the problem is money. So why, when this program is so critical, is it being run like it's public TV? Yes, like Blanche DuBois, OpenSSL has always depended on the kindness of strangers.
Are OpenSSL users living in a dream world, like Blanche? There's some truth to this. Is OpenSSL going to crack under the pressure, like Blanche? No, it's not that bad.
The dream world part of it comes from the notion that software is more secure for being open source. Even if this were ever true, and I'm not sure it was, it's surely not true anymore.
How about closed source? Is it more or less secure because source code is not generally available? As a general rule, I'd say the answer is neither. The key, just as with open source, is how much time qualified people spend auditing and testing it. Windows and other major Microsoft products get a ton of attention from outside testers, both black hat and white hat.
Is that same research being performed on major open source projects? Not to the same degree, at least as far as I can see. A large percentage of the vulnerabilities fixed by Apple and Microsoft are reported to them by research companies with bug bounties. I scanned the list of vulnerability reports at HP TippingPoint's Zero Day Initiative, VeriSign iDefense, security-assessment.com and KeenTeam and I don't see any reports about open source projects. Lots of big name software from Microsoft, Adobe, Oracle, Apple, and the like, but no libpng, no Apache, no MySQL, no PHP.
When vulnerabilities are reported in such programs, it's typically from independents. You can see this by studying Apple's disclosures. You'll see the same in the security updates from Apache although, from what I can tell, most large open source projects don't make it easy to find a list of their security updates and the vulnerabilities fixed in them.
"Cathedrals" like Microsoft and Apple have another advantage in situations like Heartbleed: patch delivery and installation. All Windows users can get their updates from Microsoft and Apple and even consumers have simple, yet sophisticated systems for installing them. There are undoubtedly many OpenSSL installations in forgotten computers, appliances, perhaps even embedded devices, and a fat chance that they'll be updated ever.
So I think it's fair to say, to a point, that OpenSSL benefited little, with respect to Heartbleed, from being open source. It's also fair, to a point, to say that it suffered from being open source if that made it easier to skip the thorough testing the software should have received. The solution isn't to close the source code; it's to recognize the limitations of open source and test programs like OpenSSL as if no source were available.
For the moment there's little we can do about this. It's virtually impossible to use the Internet without relying on the work of numerous open source projects, the principals of which are unknown to all but a few people and the development standards for which must vary wildly.
Perhaps the conclusion we should all draw is that it works; we wouldn't be using it if it didn't. That's a pretty low bar of accomplishment, and in fact all it means is that it looks like it works. I don't like the idea of entrusting my business to software for which nobody can be held responsible, even if the source code is freely available, but for now I have little choice, and neither do you.