Want secure software? Listen to Marge Simpson

Summary: We can't afford software with a philosophy, no matter what the platform zealots reckon about security. We need reliable engineering.

Look, what's wrong with you people? No sooner did we have to wallop Apple with the Baseball Bat of Clarification — once it had first been studded with all manner of spiky questions about how a deeply flawed chunk of security software managed to get into production code — than we have to wield the whacker once more against the wacky world of Linux. Sheesh!

Five points to the folks responsible for the GnuTLS library for conducting the code audit that eventually found this embarrassing problem, but minus a couple hundred points for it not being discovered for what looks like eight years or so. But I'm sure nothing would have gone wrong in the meantime, right? Sure.

At this point, I suppose I could take the efficient path. I could just grab my previous column, search for "Apple", and replace it with "Linux", and it's mission accomplished. "Linux's goto fail needs a massive culture change to fix", that's the new headline. Yeah? No.

The problem with doing that — apart from the culture change that's required being subtly different in each case, and apart from it being something my editor would frown upon — is that it makes the whole thing look like it's about platform-based tribal zealotry. Far too much of the reaction to my column, both here at ZDNet and elsewhere, was much like the thousands of words that have been written about Apple and Linux in recent days, which never rose above the "Yah boo sucks!" level of analysis.

Explanation by slogan and buzzword? There's no science in that. That serious TLS/SSL bugs existed at all in these major operating systems for unknown lengths of time simply proves that the code wasn't tested properly prior to release. That needs to be fixed. There are lessons to be learned. And no amount of tribal posturing should be allowed to distract us from that point.

When it comes to sourcing our security software, the great analyst Marge Simpson was right: "We can't afford to shop at any store that has a philosophy" — whether that philosophy is about being designed by Apple in California, or many eyes, or freedom, or whatever hand-waving feelpinions people might proffer.

No, we don't need a philosophy so much as we need science — or, more accurately, engineering.

Many developers like to be called "software engineers", but it seems to me that there's precious little actual engineering going on. "The application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software"; that's how the IEEE defines software engineering, yet it strikes me that many, many software projects are missing quite a few elements of that definition.

Security vendors, where is your evidence — with verifiable numbers, if you please — that spending money on your new products will be demonstrably better for my organisation's risk profile than some other approach, such as using a competitor's approach or maybe even doing nothing at all?

After all, Verizon's respected Data Breach Investigations Report (DBIR) continues to show that we're not getting any better at keeping the bad guys out. The Australian Signals Directorate (ASD), formerly the Defence Signals Directorate (DSD), continues to update its Strategies to Mitigate Targeted Cyber Intrusions, which continues to show the same four things that'll solve 85 percent of your problems in that area — none of which really need you to buy anything. Same for the SANS Institute's equivalent list. And as SANS Institute director of research Alan Paller said a year and a half ago, measured risk reduction achieves real results — and, yet again, that's mostly about process, not technology.

It was also back in 2012 that I noted what I called a dangerous disconnect in cybersecurity. Vendors of all stripes have been telling us to record more and more and more stuff, so that we can be more certain of finding The Bad Guys — essentially turning every organisation into a petty-despot clone of the NSA. The main thing holding us back was pesky privacy law, according to RSA executive chairman Art Coviello. But practitioners keep saying it's really about the human factor, about getting everyone in the organisation to understand the potential risks, about developing a security culture — and, again, that's something you can't buy.

There was an echo of this disconnect in whistleblower Edward Snowden's presentation to SXSW Festival the other day, although he was talking about the NSA rather than private organisations' data collection. "By squandering precious, limited resources on 'collecting it all', we end up with more analysts trying to make sense of harmless political dissent and fewer investigators running down real leads," he said.

Of course, writing new code is much more fun than code reviews, unit testing and audits. Data mining and analysis and playing counter-intelligence against the hackers is much more fun than patching servers and telling the boss yet again that he shouldn't click through to the "shocking video".

But this is the culture that needs changing, regardless of the platform. It's not about being cool or fighting for freedom. It's about protecting us from threats. If science and engineering can help us do it better, that's a good thing — even if it isn't as much fun, or won't sell as many units.

Topics: Security, Apple, Linux, Software Development

About

Stilgherrian is a freelance journalist, commentator and podcaster interested in big-picture internet issues, especially security, cybercrime and hoovering up bulldust.

He studied computing science and linguistics before a wide-ranging media career and a stint at running an IT business. He can write iptables firewall rules, set a rabbit trap, clear a jam in an IBM model 026 card punch and mix a mean whiskey sour.

Talkback

31 comments
  • And none of the problems would have been spotted without the source.

    So a philosophy IS still necessary.
    jessepollard
    • jessepollard: "philosophy IS still necessary"

      Consider the Debian Project which defaulted with OpenSSL up to Debian Squeeze and then switched to GnuTLS starting with Debian Wheezy. It was FOSS philosophy related to the GPL that prompted Debian to switch from OpenSSL to GnuTLS.

      Both OpenSSL and GnuTLS are open source projects.
      Rabid Howler Monkey
  • Eight Years Even With The Source

    Does not make it look good. Imagine someone having the source, knowing the problem, and silently exploiting it.

    Heck, maybe the NSA didn't really need to order backdoors.
    madfry
    • Not used very much.

      Until rather recently I had never even heard of that particular library.

      Always used openssl.
      jessepollard
      • jessepollard: "Not used very much"

        Starting with Debian Wheezy (currently, Debian stable), GnuTLS replaced OpenSSL. Debian is the most widely-used base distro for deriving Linux distros. Ubuntu is derived from Debian and Linux Mint is derived from Ubuntu. These three distros are the most popular at DistroWatch by a country mile.

        And Red Hat, the employer of Nikos Mavrogiannopoulos, one of the two primary maintainers of GnuTLS, ordered the security audit of GnuTLS. GnuTLS is used in RHEL, the most widely-used, commercial Linux distro in enterprises.
        Rabid Howler Monkey
        • DistroWatch

          The DistroWatch Page Hit Ranking statistics are a light-hearted way of measuring the popularity of Linux distributions and other free operating systems among the visitors of this website. They correlate neither to usage nor to quality and should not be used to measure the market share of distributions. They simply show the number of times a distribution page on DistroWatch.com was accessed each day, nothing more.


          http://distrowatch.com/dwres.php?resource=popularity
          RickLively
          • Corroborating evidence ...

            The Linux Journal 2013 Readers' Choice awards:

            Best overall Linux distro:
            o Ubuntu 16%
            o Debian 14.1%
            o Arch Linux 10.8%
            o Linux Mint 10.5%

            Debian, Ubuntu and Linux Mint total 40.6%
            and toss in SolydK at 4.7%, Kubuntu at 3.8% along with Xubuntu at 2% to put Debian and Debian-based distros just over 50%.

            http://www.linuxjournal.com/rc2013

            Debian and Debian-based distros represent a large segment of Linux distros used outside of the enterprise.

            P.S. At least you didn't try to argue that RHEL wasn't the most widely used commercial Linux distro in the enterprise.
            Rabid Howler Monkey
        • Re; GnuTLS and Nikos

          Approved changes for Fedora 21.

          “System-wide crypto policy”

          “Unify the crypto policies used by different applications and libraries. That is allow setting a consistent security level for crypto on all applications in a Fedora system. The implementation approach will be to initially modify SSL libraries to respect the policy and gradually adding more libraries and applications.”

          Name:Nikos Mavrogiannopoulos

          https://fedoraproject.org/wiki/Changes/CryptoPolicy
          RickLively
      • The bug is further down in the stack than TLS/SSL...

        1. The real problem is that Microsoft has not implemented the complete TCP/IP stack,
        and
        2. People write about these things as if they know the answers, yet have never coded a C program. They speak high and low about "problems" and "security" as if the nail will break one day. The problem is even worse because we now have very few programmers who have ever written a C program to make two computers communicate - initiate communication, detect failure, teardown - and still they claim they know how. There must be a serious change in attitude about skills; there is a need to know the fundamental basics, and not just the color of the border and the size of the buttons - which is what the journalists write about...

        Please free us from the ignorant!
        knuthf
  • Chip Architecture upwards

    Current security problems stem all the way from the ground up, as modern computer systems are still built on ancient technology that was never designed to give security a high priority. The only way this will ever be better addressed is going back to the drawing board and redesigning from the ground up. Everything else is just patching a leaky boat that will always leak.
    Alan Smithie
    • Technology strategy

      TCP/IP was made by the US DoD so that they could intercept messages and tap what they wanted.
      In addition to that, Microsoft tried to convert the world to using Xerox's proprietary protocols, made before TCP/IP but fully aware of the work at MIT. The banks still use OSI-type communication - because this is safe, and was made with security in mind. When you make a system and believe it is just you who are able to hack your way into it, you are naive and make unacceptable assumptions. Unfortunately, again and again the Americans have been led down the "commercial path", and it seems to be a disease over there to trust the NSA, CIA, FBI and the local sheriff. But - now they blame their president - strange people. We detected that 25 to 35 percent of all transmissions on the mobile network were by non-registered companies, or no phone company at all, because the "CLEC" code used to invoice companies for network use was missing. That was more than 10 years ago. So we stopped that, and then the men in black turned up and demanded it be put back. You needed a hacker to tell you this; nobody believed the engineers who made the telecom networks - well, they did not say the same as your sheriff.
      knuthf
  • totally agree

    Many problems are repeated over and over, and discovered over and over, because they are NEVER ENGINEERED OUT, even though in most cases we already know how and have known for years. Clearly code review by itself is not enough and is far too unreliable, as so expertly demonstrated; that meme is dead. We know who Einstein was talking about.
    greywolf7
  • Right on.

    And it WILL get worse. The reason is we are rapidly heading to a 'NIX-based monoculture. Hell, even Microsoft is tacitly sanctioning an Android device - and being strongly advised to create an Android-Windows FrankenOS. Which would have precedence - can you say "Apple".

    But with monocultures come even more vulnerabilities. Especially since that monoculture was based on the knockoff rather than the original for the most part. We are not talking commercial-grade UNIX here, we are talking Linux. Great OS, but not the same pedigree. And if one of these eight-years-to-fester things is in there still, don't you think efforts by the bad guys are going on to find it? You bet your black and white flightless aquatic bird they are.

    And yes, auditing and testing old code sucks. And it really sucks (and bruises egos) when this supposedly okay code has a bug in it - especially a security bug. But it has to be found. Or else you scrap it and move on to a new version. And hope whatever pattern led to the original bug does not get replicated (amazing how often that happens).

    Working in corporate software development, I know the fun part is the coding and firing up the developed item. All the testing and confirmation is NOT fun. But that is what lets you finish the thing, move on, and not worry about hearing about it later. Even eight years later. But in these cases, it is not just to allow moving on. It is to maintain confidence in the system and protect it. So yes, it sucks. But it has to be done.
    jwspicer
    • jwspicer: "with monocultures come even more vulnerabilities"

      Monocultures result in more vulnerabilities!? This does not compute.

      Vulnerabilities result from poor design and implementation choices. Secure software development lifecycle processes can help to reduce vulnerabilities, with the goal being zero vulnerabilities.

      The problem with monocultures like Windows on desktop PCs and Android on smartphones is that the malware miscreants, especially the mass malware miscreants, tend to target the most widely-used software. They're big targets.

      Both advanced persistent threats and targeted attacks tend to target whatever software is used by the targeted individuals and organizations. Note that targeted attacks have been waged against OS X involving both organizations and individuals (e.g., Tibetan activists).
      Rabid Howler Monkey
      • Most used..

        is Linux - your router probably runs Linux, most of the Internet runs Linux, and with all the Android phones, those that use Windows are a minority now.

        Unix had security problems initially, and they were solved. Linux has inherited this.
        Windows was owned by Microsoft, which planted "commercial exploitations" that they believed only MS could use - well, we all see where that road led. Now they cannot fix it, because those who made the initial code have left MS, and if they code, they do so on Linux.
        knuthf
    • Commercial...

      We are talking very commercial about Linux: thousands of people work with Linux every day, and none of these work for free (just so you know). Some get better paid than either Apple or Microsoft would ever pay them. A very good example here is Google - who use "Goobuntu", and their Chrome browser is open source. The reason is that the systems get so complex that you need the best - and the best are not for sale. They will not make compromises so that sales gets to sell more for their company; they want to do it right, and be able to prove that it is right.

      So there are just as many people behind the source code in Linux as in any other OS, probably more, also collecting a paycheck every month, just as those at Microsoft and Apple. But their loyalty is to those who use the systems.
      knuthf
  • We need to start with the hardware and compilers.

    We need to start with the hardware and compilers.

    Theoretically speaking, we shouldn't even be ALLOWED to create software with buffer overruns in it. We shouldn't even be ALLOWED to touch memory space we don't own. What's the use of technologies like DEP if they are easily overridden? Why are we still having problems at the low level of things?

    Unit testing and code reviews are great - but they are clearly not good enough. We need to make hacking things harder at the lowest levels as well.
    CobraA1
  • Let the peons debug

    After all these years, and still the same problems.
    Buffer overflow?
    Come on, folks!
    Sorry to say, but so-called programmers have no idea how to write code, nor have they ever heard of debugging.
    radu.m
  • It's a *Linux* problem?

    Serious fail, Stilgherrian, seeing as how the Linux developers don't have anything to do with GnuTLS.

    Anyway, I saw this quote years ago: "A C program is like a fast dance on a newly waxed dance floor by people carrying razors."

    If you truly want engineered code, use a language like Ada.
    Media Whore
    • Media Whore: "Linux developers don't have anything to do with GnuTLS"

      I would have stated, Linux kernel developers don't have anything to do with GnuTLS.

      Plenty of Linux devs with the Debian Project and Canonical, Ltd., work with GnuTLS as it's used in many packages.

      Your statement is a form of denial.
      Rabid Howler Monkey