Want secure software? Listen to Marge Simpson

We can't afford software with a philosophy, no matter what the platform zealots reckon about security. We need reliable engineering.
Written by Stilgherrian, Contributor

Look, what's wrong with you people? No sooner had we walloped Apple with the Baseball Bat of Clarification — once it had first been studded with all manner of spiky questions about how a deeply flawed chunk of security software managed to get into production code — than we had to wield the whacker once more against the wacky world of Linux. Sheesh!
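For those who missed the Apple episode: the now-infamous "goto fail" flaw in Secure Transport came down to a single duplicated line. The second goto fail ran unconditionally, the final signature check was never reached, and the function returned success for connections it had never fully verified. Here's a condensed paraphrase — stand-in stubs rather than the verbatim source, so the shape of the bug can actually run:

```c
#include <stdio.h>

/* Condensed paraphrase of the Secure Transport flaw, with stand-in
 * stubs; not the verbatim Apple source. 0 means success throughout,
 * mimicking OSStatus conventions. */

static int hash_update(const char *what)
{
    printf("hashing %s\n", what);
    return 0;
}

static int verify_signature(void)
{
    puts("verifying signature");          /* the one check that matters */
    return -1;                            /* would fail a forged signature */
}

static int verify_signed_server_key_exchange(void)
{
    int err;

    if ((err = hash_update("serverRandom")) != 0)
        goto fail;
    if ((err = hash_update("signedParams")) != 0)
        goto fail;
        goto fail;                        /* BUG: duplicated line, always taken */
    if ((err = verify_signature()) != 0)  /* never reached */
        goto fail;

fail:
    return err;                           /* err is still 0: reported as verified */
}

int main(void)
{
    if (verify_signed_server_key_exchange() == 0)
        puts("handshake accepted");       /* prints, though nothing was verified */
    return 0;
}
```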

Five points to the folks responsible for the GnuTLS library for conducting the code audit that eventually found this embarrassing problem, but minus a couple hundred points for it not being discovered for what looks like eight years or so. But I'm sure nothing would have gone wrong in the meantime, right? Sure.
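The GnuTLS mistake was a different species of the same genus. As reported, error paths in the certificate-verification code jumped to a cleanup label and handed back a negative error code to callers that treated any non-zero value as boolean "true" — so a certificate that triggered a parse error could count as verified. A minimal sketch of that class of bug, not the actual GnuTLS source:

```c
#include <stdio.h>

/* Illustrative sketch of the bug class only; not the actual GnuTLS
 * source. PARSE_ERROR and cert_t are stand-ins. */

#define PARSE_ERROR (-43)                    /* hypothetical error code */

typedef struct { int malformed; } cert_t;

static int parse_constraints(const cert_t *cert)
{
    return cert->malformed ? PARSE_ERROR : 0;
}

/* Intended contract: return 1 if the cert is a CA, 0 if it isn't. */
static int check_if_ca(const cert_t *cert)
{
    int result = parse_constraints(cert);
    if (result < 0)
        goto cleanup;                        /* BUG: result stays negative */

    result = 0;                              /* real checks would go here */

cleanup:
    return result;                           /* error code leaks to caller */
}

int main(void)
{
    cert_t evil = { .malformed = 1 };

    /* The caller asks a yes/no question, so -43 counts as "yes". */
    if (check_if_ca(&evil))
        puts("treated as a valid CA");       /* prints, despite the error */

    return 0;
}
```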

At this point, I suppose I could take the efficient path. I could just grab my previous column, search for "Apple", and replace it with "Linux", and it's mission accomplished. "Linux's goto fail needs a massive culture change to fix", that's the new headline. Yeah? No.

The problem with doing that — apart from the culture change that's required being subtly different in each case, and apart from it being something my editor would frown upon — is that it makes the whole thing look like it's about platform-based tribal zealotry. Far too much of the reaction to my column, both here at ZDNet and elsewhere, was much like the thousands of words that have been written about Apple and Linux in recent days, which never rose above the "Yah boo sucks!" level of analysis.

Explanation by slogan and buzzword? There's no science in that. That serious TLS/SSL bugs existed at all in these major operating systems for unknown lengths of time simply proves that the code wasn't tested properly prior to release. That needs to be fixed. There are lessons to be learned. And no amount of tribal posturing should be allowed to distract us from that point.
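And "tested properly" doesn't mean anything exotic. Both bugs sit squarely in the kill zone of the most basic negative test: hand the verifier something invalid and assert that it gets rejected. A minimal sketch — cert_t and verify_certificate() are hypothetical stand-ins, not any real library's API:

```c
#include <assert.h>
#include <stdio.h>

/* Hypothetical harness: cert_t and verify_certificate() are stand-ins,
 * not a real library's API. The point is the shape of the test. */

typedef struct { int signature_valid; } cert_t;

static int verify_certificate(const cert_t *cert)
{
    return cert->signature_valid ? 0 : -1;   /* 0 == verified */
}

/* The negative case both bugs would have flunked: invalid input must
 * be rejected, not quietly waved through. */
static void test_rejects_forged_signature(void)
{
    cert_t forged = { .signature_valid = 0 };
    assert(verify_certificate(&forged) != 0);
    puts("forged certificate rejected, as it should be");
}

int main(void)
{
    test_rejects_forged_signature();
    return 0;
}
```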

When it comes to sourcing our security software, the great analyst Marge Simpson was right: "We can't afford to shop at any store that has a philosophy" — whether that philosophy is about being designed by Apple in California, or many eyes, or freedom, or whatever hand-waving feelpinions people might proffer.

No, we don't need a philosophy so much as we need science — or, more accurately, engineering.

Many developers like to be called "software engineers", but it seems to me that there's precious little actual engineering going on. "The application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software"; that's how the IEEE defines software engineering, yet it strikes me that many, many software projects are missing quite a few elements of that definition.

Security vendors, where is your evidence — with verifiable numbers, if you please — that spending money on your new products will be demonstrably better for my organisation's risk profile than some other approach, such as using a competitor's product, or maybe even doing nothing at all?

After all, Verizon's respected Data Breach Investigations Report (DBIR) continues to show that we're not getting any better at keeping the bad guys out. The Australian Signals Directorate (ASD), formerly the Defence Signals Directorate (DSD), continues to update its Strategies to Mitigate Targeted Cyber Intrusions, which still lists the same four things that'll solve 85 percent of your problems in that area — none of which really need you to buy anything. Same for the SANS Institute's equivalent list. And as SANS Institute director of research Alan Paller said a year and a half ago, measured risk reduction achieves real results — and, yet again, that's mostly about process, not technology.

It was also back in 2012 that I noted what I called a dangerous disconnect in cybersecurity. Vendors of all stripes have been telling us to record more and more and more stuff, so that we can be more certain of finding The Bad Guys — essentially turning every organisation into a petty-despot clone of the NSA. The main thing holding us back was pesky privacy law, according to RSA executive chairman Art Coviello. But practitioners keep saying it's really about the human factor, about getting everyone in the organisation to understand the potential risks, about developing a security culture — and, again, that's something you can't buy.

There was an echo of this disconnect in whistleblower Edward Snowden's presentation to the SXSW festival the other day, although he was talking about the NSA rather than private organisations' data collection. "By squandering precious, limited resources on 'collecting it all', we end up with more analysts trying to make sense of harmless political dissent and fewer investigators running down real leads," he said.

Of course, writing new code is much more fun than code reviews, unit testing and audits. Data mining and analysis and playing counter-intelligence against the hackers are much more fun than patching servers and telling the boss yet again that he shouldn't click through to the "shocking video".

But this is the culture that needs changing, regardless of the platform. It's not about being cool or fighting for freedom. It's about protecting us from threats. If science and engineering can help us do it better, that's a good thing — even if it isn't as much fun, or won't sell as many units.
