More FOSS security scare-mongering

Summary: With all the talk of open source and the Obama administration, it shouldn't come as any surprise that the scare-mongering around FOSS security is going to be close behind -- and here's part of the first wave, fresh from Ernest M. Park.


Park is using a single data point (the Debian OpenSSL issue from last spring) to try to build uncertainty around the readiness of FOSS for government work, even though he admits proprietary software may be no more secure than FOSS. Here's what Park has to say:

Now one of the arguments for open source is that there are more eyes looking over the code, since the code is openly available to be reviewed and changed by the community. This is true and one of the reasons that this bug was discovered. The open source system of discovering bugs is beneficial in that the number of people reviewing the code is far greater than for proprietary software. But as the Debian OpenSSL case shows us, it might take up to two years before a bug is discovered, or at least published. Within those two years, this bug may have already been discovered and not published, with the finder exploiting it all that time. The problem with community review is that it is a voluntary choice and not an obligation.
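
For context on how severe that single data point was: the Debian patch left OpenSSL's random number generator seeded essentially only by the process ID, so the entire keyspace collapsed to a few tens of thousands of possibilities. Here's a toy sketch of why that is fatal (plain Python, not actual OpenSSL code; the 64-bit value stands in for a real key):

```python
import random

def weak_keygen(pid: int) -> int:
    """Simulate key generation seeded only by the process ID,
    as in the Debian OpenSSL bug (CVE-2008-0166)."""
    rng = random.Random(pid)      # the PID is the *only* entropy
    return rng.getrandbits(64)    # stand-in for a real private key

# A victim generates a key under some PID unknown to the attacker.
victim_key = weak_keygen(12345)

# Linux PIDs were capped at 32768, so an attacker can simply
# regenerate every possible key -- no access to the victim needed.
candidates = {weak_keygen(pid) for pid in range(1, 32769)}
print(victim_key in candidates)   # → True
```

Published blacklists of the weak Debian keys were built by exactly this kind of enumeration, which is why the bug was so damaging once disclosed.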

The problem with Park's argument is this: Access to code is not necessary for discovery of vulnerabilities. Plenty of security holes are discovered in proprietary products without the results being published. Plenty of security holes have existed in proprietary products and been exploited long before the fix was available.
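
That point about not needing source access is exactly what black-box fuzzing demonstrates: feed a program malformed input and watch for crashes, no code review required. A minimal sketch, with a hypothetical flawed parser (`parse_record` is invented for illustration, not any real library) standing in for a closed-source component:

```python
import random

def parse_record(data: bytes) -> bytes:
    """A parser we treat as a black box: the first byte declares
    the payload length, which is trusted without validation."""
    if len(data) < 1:
        raise ValueError("empty input")
    declared_len = data[0]
    payload = data[1:]
    if declared_len > len(payload):
        raise IndexError("read past end of buffer")  # the hidden flaw
    return payload[:declared_len]

# Black-box fuzzing: throw random bytes at it and log the crashes.
random.seed(0)
crashes = []
for _ in range(1000):
    blob = bytes(random.randrange(256) for _ in range(random.randrange(1, 8)))
    try:
        parse_record(blob)
    except IndexError:
        crashes.append(blob)

print(len(crashes))   # the flaw surfaces with no source access at all
```

An attacker running this against a proprietary binary learns about the vulnerability just as surely as one reading open source code, which is why "fewer eyes on the code" is not the security guarantee closed-source advocates imply.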

If Park wants to raise concerns about software security, he might start by asking if Microsoft is ready for government work.

Topics: Security, Government, Government US, Open Source


  • Open Source is vastly superior

    An Open Source project has greater flexibility, and the concept works because people contribute when they want to work on the project, not because they are begrudgingly working on something for a paycheck.

    The closed source route is just too slow to change with market trends, plus you are at the mercy of the vendor: wait for a patch, or "we will try to get this in the next release."

    Also, with companies wanting to save money, I cannot see mass license renewal of software that is not really needed; 'MS Office' comes to mind.
    • Actually statistically no it's not

      At least if you're talking about OSes and browsers. And yes, bugs are found in both, but it is easier to find them when you have the source. And don't forget the US Gov doesn't have to wait for proprietary fixes; those contracts get them on private patches...
      Johnny Vegas
  • The rate at which distros are refreshed, increases bug spotting

    The recent patches for MSFT Windows were for vulnerabilities dating back to 1999 (Win2K).
    So it would appear proprietary code isn't being scrutinized *unless it has to be.*

    How on earth would a bug last that long if the code is constantly perused by thousands of eyeballs for the next OS update?

    Proprietary coding teams are never as large as open-source, because they have to pay for the labour.

    Also, never forget, Microsoft hires noobs straight out of college, who are more likely to add defects than to fix them. ;)
  • Abandoned projects

    Open source software is far more secure than closed source software, no doubt about it. However, and this applies to open source as well as to closed source: *abandoned projects* do pose a risk.

    A software application which is no longer actively maintained won't ever be patched again. So if it's a widely used app, its security vulnerabilities can be exploited easily and with much yield.

    So if you want maximum security on your machine, run only open source software that you know is being actively maintained. That means, among other things: no legacy apps.
    • Or use risky apps in a sandboxed virtual machine. (nt)

  • *sigh*

    More blather about security.

    How about doing a story which lays out the known and potential attack surfaces of the various OSes. You know, something that might actually be more useful than trotting out "My OS is more secure than yours" arguments.

    That being said, I do point and laugh at a higher proportion of Windows "security" people than I do *nix and other security people. It may just be me, but it seems far more likely to find security smarts outside of the "Windows world" than in it; that is, in the sense of people who can defend their systems as opposed to crack them.
  • Scare-mongering if it involves FOSS. Truth if it involves MS.

    Double standard at its best. Like the typical freetard that you are.
  • RE: More FOSS security scare-mongering

    Mr. Brockmeier,

    I appreciate your response to my post at FOSSBazaar, but I feel that you missed the point.

    You said: "Access to code is not necessary for discovery of vulnerabilities."

    That is the point. Aside from access, there is nothing inherent in open source software that makes it any more or less secure than commercial software, and programs can be exploited with and without access to source code.

    Security testing in commercial software is either defined or implied as part of licensing and support agreements. There is a direct liability and a responsibility on a vendor, who has a vested interest in actively testing for issues and resolving them. Despite this effort, the results for commercial applications are no better, and in some categories marginally worse, than for FOSS.

    FOSS should be significantly more secure. As a community, we have no control over commercial software, but we as a group are both the vendor and the user for FOSS. If we don't take the responsibility to make it better, are we hoping that a commercial company will do it, under the guise of improving the community experience? If we are the user community for FOSS, then the responsibility to manage the risk around FOSS is ours as well.

    The number of issues per application type reported in the National Vulnerability Database is nearly equal, regardless of license. This is not a fault of open source, or a benefit afforded to commercial software. Neither is more or less secure.

    The opportunity for improvement through cooperation is directly within our control via FOSS. In and of itself, software is not safer just because source code can be reviewed. Clear auditable tests, performed by the community, following guidelines, tests along the lines of weaknesses described by, and uniform processes to secure and audit code within a community will make code safe.

    Source code availability without a clear, objective and transparent ongoing security review process is not safe, regardless of licensing.

    Free software permits us "freedom", not cost savings. With freedom comes responsibility. We don't just trust that FOSS is good and safe by association. We as a community adopt and evolve standards and methods for ongoing review and testing, thereby making all code pass verifiable scrutiny, not a "potential"

    I cite Debian and OpenSSL since it is an example which directly affected the US government and companies across our country. I am a proponent of the ideals and freedoms offered to expand knowledge, learning and enterprise afforded by FOSS. I don't spread unsubstantiated FUD, and I am not an idealist who believes that FOSS is safe. FOSS affords me an opportunity to do many things, but we as a community must accept that without a vendor to blame, we may need to accept the responsibility for defining and enforcing security audit standards for

    Ernest Park
    e at airius dot com