Why CERT should be decertified (1)

Summary: CERT is a government agency: higher standards apply.

Last week the CERT Institute, developed at Carnegie Mellon University and now "the operational arm of the National Cyber Security Division (NCSD) at the Department of Homeland Security," issued an annual systems security review and summary that drew widespread public attention.

Both the computer press and the mainstream media used this report as the basis for headlines like this one from News.com: "Linux/Unix more flawed than Windows, CERT says."

Here's the summary from The Washington Post:

According to US-CERT, researchers found 812 flaws in the Windows operating system, 2,328 problems in various versions of the Unix/Linux operating systems (Mac included). An additional 2,058 flaws affected multiple operating systems. There may well have been more than 5,198 flaws discovered this year; these were only the ones reported to US-CERT.

To the uninitiated that's pretty clear: Unix had almost three times as many problems as Windows. But it's the impact on Windows people who might have been considering a change that's most pernicious. Put yourself in their shoes: many of them have a hard time coping with Microsoft's continual patch requirements, and now they're told, by the US Federal Government, that Unix is nearly three times worse.

So the question has to be: is there a problem with Unix and, if so, how serious is it?

Let's start by looking at the Unix and multiple operating systems sections of CERT's list. Here are the first ten Unix vulnerabilities listed:

4D WebSTAR Grants Access to Remote Users and Elevated Privileges to Local Users
4D WebStar Remote IMAP Denial of Service
4D WebStar Tomcat Plugin Remote Buffer Overflow
4D WebStar Tomcat Plugin Remote Buffer Overflow (Updated)
Abuse Multiple Vulnerabilities
Adobe Acrobat Reader mailListIsPdf() Buffer Overflow (Updated)
Adobe Acrobat Reader mailListIsPdf() Buffer Overflow (Updated)
Adobe Acrobat Reader UnixAppOpenFilePerform Buffer Overflow
Adobe Acrobat Reader UnixAppOpenFilePerform Buffer Overflow (Updated)
Adobe Reader / Acrobat Arbitrary Code Execution & Elevated Privileges

Two things should strike you about this:

  1. First, a large number appear to be duplicates; in fact 1,442, or 62%, are duplicates of other listings (see the counting sketch after this list).

  2. Second, none of these seem to be Unix related: they're essentially all application related, and so are the 2,058 vulnerabilities classified as affecting multiple operating systems.
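
CERT doesn't publish a de-duplication method, so here, purely as illustration, is a minimal Python sketch of the kind of normalization that produces a count like the 62% above: collapse entries that differ only by a trailing "(Updated)" re-issue marker. The titles come from the list above; the normalization rule is my assumption, not CERT's.

[code]
import re

# Titles from the CERT list quoted above; two pairs differ only by a
# trailing "(Updated)" re-issue marker.
entries = [
    "4D WebStar Tomcat Plugin Remote Buffer Overflow",
    "4D WebStar Tomcat Plugin Remote Buffer Overflow (Updated)",
    "Adobe Acrobat Reader UnixAppOpenFilePerform Buffer Overflow",
    "Adobe Acrobat Reader UnixAppOpenFilePerform Buffer Overflow (Updated)",
]

def normalize(title):
    # Assumed rule: a re-issued advisory is still the same vulnerability.
    return re.sub(r"\s*\(Updated\)$", "", title.strip())

unique = {normalize(t) for t in entries}
print(len(entries) - len(unique), "of", len(entries), "listings are duplicates")
[/code]

By the author's count, this style of de-duplication removes 1,442 of the 2,328 Unix/Linux listings.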

Although most of the press simply ran with the numbers, a few people did look at CERT's figures a little more critically. Most notably, Joe Brockmeier and Joe Barr at Newsforge.com, part of whose review reads:

This is not to say that the data from US-CERT is a meaningless aggregation. You can easily spot the most vulnerable operating system in wide use today by taking a look at the Technical Cyber Security Alerts issued by US-CERT this year. Here's the bottom line:

  • 22 Technical Cyber Security Alerts were issued in 2005
  • 11 of those alerts were for Windows platforms
  • 3 were for Oracle products
  • 2 were for Cisco products
  • 1 was for Mac OS X
  • None were for Linux

That's quite a different picture than the one the Microsoft press machine wants you to see. Here's more of the same. US-CERT's list of current vulnerabilities contains a total of 11 vulnerabilities, six of which mention Windows by name, and none of which mentions Linux.

They're right, but it's actually a lot worse than that.

Many of the listed vulnerabilities aren't even backed by CVE listings - but consider one that is: #231, "Debian Lintian Insecure Temporary File". CERT describes this as:

A vulnerability exists because temporary files are created in an insecure manner, which could let a malicious user delete arbitrary files.

The alert was processed by CERT in January of 2005, but originated as a bug report in June of 2004. Here's the Mitre CVE listing for it:

lintian 1.23 and earlier removes the working directory even if it was not created by lintian, which may allow local users to delete arbitrary files or directories via a symlink attack.

Mitre's source for this is a December 2004 Debian patch report.
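
For readers who haven't met this attack class: an "insecure temporary file" problem arises when a program uses a predictable path in a world-writable directory like /tmp, because an attacker can pre-create that path, or plant a symlink there, before the program runs. Here's a generic Python sketch contrasting the risky pattern with the standard safe one; it illustrates the attack class and is not lintian's actual Perl code.

[code]
import os
import tempfile

# Risky pattern: a fixed, predictable name in a world-writable directory.
# An attacker who pre-creates this path (or plants a symlink at it) gets
# to choose what the program later writes to or deletes.
predictable = os.path.join(tempfile.gettempdir(), "lintian-lab")
print("a predictable (risky) path would be:", predictable)

# Safe pattern: let the OS pick an unpredictable name and create the
# directory atomically, accessible by the owner only (mode 0700).
workdir = tempfile.mkdtemp(prefix="lintian-lab-")
print("created", workdir)

# Clean up only what we created ourselves.
os.rmdir(workdir)
[/code]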

Follow that up, and you find that the vulnerability was more imagined than real. Here's what Jeroen van Wolffelaar, the maintainer for that code, had to say when the issue came to his attention on December 19th:

I noticed this before, but at that time didn't think it was a security issue. Directory creation would simply fail if that name is already taken, and the cleanup afterwards is harmless. If the name is not yet taken, no issue.

However, when re-reading, I see that this assessment was a misreading of the sources.

But a day later he reports:

Argh, after looking again, I still stand by my initial assessment, I was misleaded by the theory that the logic was bogus. The key point is:

| if (not -d "$LINTIAN_LAB" or ($lab_mode eq 'temporary'))
| mkdir($LINTIAN_LAB,0777) or fail("cannot create lab directory $LINTIAN_LAB")

And, this is correct. If $lab_mode is not temporarily, a lab location was specifically given to lintian, and we should assume that the invoker of lintian in that case knows what he does. In all other cases, i.e., lab_mode equals temporary, the condition in the if is true (note the 'or'), and the lab dir is unconditionally tried to be made, which fails if it already exists.
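
To see concretely why that 'or' makes the code safe, here is a Python analogue of the Perl quoted above (my illustration, not the lintian source): mkdir() refuses to reuse an existing path, even one where an attacker has planted a symlink, so the run fails instead of adopting, and later deleting, the attacker's chosen target.

[code]
import os
import tempfile

scratch = tempfile.mkdtemp()   # private sandbox for this demonstration
lab = os.path.join(scratch, "lab")

# The "attacker" plants a symlink where the lab directory should go.
os.symlink("/etc", lab)

# Like Perl's mkdir(), os.mkdir() fails if the path already exists; it
# neither follows nor adopts the symlink, so cleanup can never be
# redirected at /etc.
try:
    os.mkdir(lab, 0o777)
except FileExistsError:
    print("lab path already taken; aborting, as lintian does")

os.unlink(lab)   # removes the planted symlink only, not its target
os.rmdir(scratch)
[/code]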

In other words, the problem never existed, was (erroneously) reported in mid 2004, cleared in 2004, and counted against Unix in a 2005 summary claiming the authority of the United States Government.

It's not the only one either. The overwhelming majority of the Unix and applications vulnerabilities listed either can't be exploited in current products, require absurdly unlikely circumstances to become actionable, or have no factual basis.

That same liberal use of imagination affects a lot of the listings in the section on multiple operating systems vulnerabilities, since that section covers applications that originate primarily on Unix. Consider, for example, this rather desperate attempt to see a vulnerability in a widely used Unix application:

The apache2handler SAPI (sapi_apache2.c) in the Apache module (mod_php) for PHP 5.x before 5.1.0 final and 4.4 before 4.4.1 final allows attackers to cause a denial of service (segmentation fault) via the session.save_path option in a .htaccess file or VirtualHost.

This is based on the following:

From: Eric Romang / ZATAZ.com (exploits@zataz.net)
Date: Mon Oct 24 2005 - 02:36:38 CDT

Hello,

Here under some stuff to dos apache + php just through an htaccess.

* With .htaccess method :

If you have into your php.ini -> safe_mode = On

Simply put a .htaccess file on the root directory of your website with this content :

php_value session.save_path /var/www/somewherehowexist

Apache segfault with :

[Fri Sep 30 10:33:11 2005] [notice] child pid 17743 exit signal
Trace/breakpoint trap (5)

There was a bug in the apache2handler SAPI, sapi_apache2.c file, that made this segfault here possible, the bug now is fixed upstream and 5.1.0 final, 4.4.1 final and the next 5.0.X release will have the patch.

Also work with session.save_path into a VirtualHost.

Gentoo bug report :

http://bugs.gentoo.org/show_bug.cgi?id=107602
and
http://bugs.gentoo.org/show_bug.cgi?id=98871

In other words, this was a bug that was found and fixed in pre-release testing.

CERT's excuse here is that it just publishes the facts - a list of claimed vulnerabilities - and if the public doesn't read the fine print, well, that's not their problem, is it?

But, of course, it is. CERT is a government agency: higher standards apply.

CERT issues this kind of public information in the full knowledge that both the mainstream media and the technology press will run with it. Basically, they can't help but know that they're misleading the public, and that's utterly irresponsible at best and dishonest at worst.

CERT isn't responsible for educating the mainstream press, but they can, and should, respond to predictable ignorance among their primary audience by providing a more meaningful metric than simple counts of claimed vulnerabilities. Something along the lines of cancer rates per 100,000 population would work nicely: weigh the risk associated with each vulnerability according to the number of systems per 100,000 in the installed base that are actually affected, and you'd get a reportable statistic ranging from zero for most of the claimed Unix vulnerabilities to nearly 100,000 for something like the current WMF scare (reported, by the way, in 2005 but not counted against Microsoft in this list).

Periodic summaries, like the current year-end report, could then give cumulative risk statistics in percentage terms - something the mainstream press could easily understand and pass on to its audience.
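
To make the suggestion concrete, here is a minimal sketch of how such a metric might be computed. Every figure in it is invented for illustration; the weighting rule, systems actually affected per 100,000 installed, is the one proposed above.

[code]
# Hypothetical sketch of the proposed metric; all figures are invented.
# Each claimed vulnerability is weighted by the number of systems per
# 100,000 in the installed base that are actually affected.
claims = [
    ("Debian lintian temp file",  0),      # never exploitable in practice
    ("PHP session.save_path DoS", 3),      # fixed before final release
    ("WMF rendering flaw",        99000),  # nearly every installed system
]

for name, affected in claims:
    print(f"{name}: {affected} per 100,000")

# A year-end summary could then report cumulative risk in percentage
# terms, e.g. average exposure across all claims against a platform.
average = sum(affected for _, affected in claims) / len(claims)
print(f"average exposure: {average / 1000:.1f}% of the installed base")
[/code]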

I'll have more on this next week, but for now: if you're American, how about calling or writing your congressman and senator to raise the issue with them? Barring that, if you're a Unix vendor and you've lost a sale to security concerns, consider asking a lawyer how much liability CERT could be assessed for.


Talkback

  • Appalling

    The biggest issue here is - just what constitutes "Linux"? If I have an extremely bad app (security-wise) and I get it placed in a Linux distro - then the distro is considered a security problem?

    The WHOLE POINT of *NIX is to have a SEPARATE operating system "layer" from the applications. Windoze commingles their OS and apps and causes many security issues. These Linux distros are doing the same (stupid) thing - throw some apps in with the OS and cause problems.
    Roger Ramjet
  • Thanks Murph

    You 'Got the Facts' ;)
    George Ou, well, he needs to 'Get the Facts'.
    He spews more sensationalism and misinformation in his blog today.
    D T Schmitz
  • Interesting

    [i]"...have a hard time coping with Microsoft's continual patch requirements"[/i]

    No system administrator should have a problem patching their system with vendor-supplied patches, I don't care how often they come out.
    Real World
    • It's not that

      In a large enterprise (such as Company "F"), new patches need to be CERTIFIED before they are spread out to all 170,000 PCs in the company. This means that testing is required to see if the patch breaks anything. This effort takes time, effort, and $, and the more M$ releases, the more often this effort happens. Now THAT is "the problem".
      Roger Ramjet
      • How is that any different for Unix systems?

        I'll tell you how: their patches don't come out on a consistent cycle. That has pros and cons, but one cannot fault Microsoft for taking a more admin-friendly approach. Windows admins can plan their rollouts. They know the patches will be released the second Tuesday of every month, so patches can go to testing for 1 week and then get rolled out the following week. Unix patches are released helter-skelter, and may require more testing due to the potentially wider range of system configurations.
        Real World
        • Nah

          Windoze patches are NEEDED about a week before they are released. There is very little time for testing because the exploit code is out. I'm sure this WMF patch was RUSHED into production because of the imminent situation.

          Unix patches are usually for innocuous things like driver updates. I've never seen anything that's as immediate as the WMF stuff on *NIX.
          Roger Ramjet
          • Which brings us back to

            the surface area principle*. And I would call any patch a vendor releases 'NEEDED'. Otherwise, why release it?

            *More installations = more people working on it = more less-qualified people working on it = more exploits of vulnerabilities that already have a patch.
            Real World
          • Reset

            Most UNIX patches I've seen are for things like "flaw in certain library (which you usually never use) can cause crashes" or "Clock loses time" or "Using this app can delete your file". I never see things like this WMF disaster. Just HOW would you describe it in a patch? All files on the system susceptible to being overwritten? I guess I'm talking scope here, as the flaws in UNIX never get close to the severity of Windoze.
            Roger Ramjet
          • That actually brings us back to Roger's first post.

            Roger's point was not [b]if[/b] the patch was needed but [i]when[/i]: "weeks before the patch comes out," as in the case of the WMF vulnerability and using Internet Explorer to surf to an infected website --noting that IE is integrated into the OS-- or "when you finally decide to use this or that particular application," as in the case of the WMF vulnerability and WINE in conjunction with CrossOver Office and Microsoft Word to read a document with an embedded, malformed WMF --a combination not usually installed or used where it applies, and one that was patched long before MS delivered their patch.

            Please also note that this "threat" to Linux was more perceived than real, as borne out in the talkbacks to Ou's published remarks.

            One of the advantages of the Linux distro model is that even if Linux is defined as the distro, including the kernel and all the apps distributed with it, none of the apps are integrated into the OS, most are not loaded by default, and all are optional installs. With Windows, IE is integrated, loaded by default, and cannot be omitted during install nor removed after the fact.

            To me, that makes IE part of the Windows OS, but even if every Linux distro includes Firefox, Mozilla, and Gnome, that doesn't make GTK+, Gecko, Gnome, nor Fx part of the OS by a long shot, since all are optional and can be uninstalled.

            Also, note that the patching systems in Linux distros are not in the same vein as MS patches. For instance, to patch only my WINE on Gentoo Linux, I do:
            [code]
            emerge wine
            [/code]
            and to patch my entire system I do:
            [code]
            emerge system
            [/code]

            Most patches do not require a reboot. I have recompiled kernels without a reboot. That is called business continuity, a must in the enterprise. This is one reason why MS updates are a problem for many an admin, such as myself, who works in an MS shop.
            The King's Servant
        • Flexibility

          "They know the patches will be released the second Tuesday of every month, so they can go to testing for 1 week then get rolled out the following week."

          There's nothing that prevents UNIX admins from doing a monthly patch after a week of testing.

          "and may require more testing due to the potnetial wider range of system configurations."

          Businesses will most likely have a standard configuration of UNIX, just as they would with the Windows platform.
          SGT_Spam
          • I was talking about

            servers, since Unix is on virtually none of the business desktops in the world.
            Real World
          • Depends what you mean by UNIX desktop

            There are over 12,000 UNIX workstations being used today at Company "F". They exist on desktops - usually side-by-side with PCs (unless it's a Sun workstation with its PCi card). Are they business? Well, many of these workstations are used for CAD/CAM/CAE - so is that business?
            Roger Ramjet
        • Another difference

          Unix OSes all use reliable package managers that can roll back patches as needed.

          I've never had a patch to anything but the kernel kill a unix machine so bad it took physical intervention to roll back the patch.

          All other patches can be rolled back en masse or individually by remote access by the sysadmin. In Windows, many of the patches are NOT CAPABLE of being rolled back. Once you install them, they are there, and if you don't like the way your system behaves, your options are severely limited.
          Sxooter_z
  • Point of view ..

    It depends, as do the results of this CERT study, on the angle from which you're looking at the 'problem'.

    The person who wrote this study probably wanted to keep everybody happy...for the $ or ? which were invested in this research.

    Maybe we should award this CERT employee/the CERT with a Certificate of delivering useless information (for an invaluable price)....

    BTW, I often read technical articles in (news)papers and magazines about products which are just a summary of the release notes the producer of the product provided... Maybe it's time as well to create a new type of education for journalists...
    Arnout Groen
  • Duplicates?

    Duplicates don't seem to bother you people when they are about Windows. Like when a supposedly brand-new virus is counted as a whole new vulnerability, when in reality it's just a variant of a previous virus that takes advantage of an already patched flaw.

    Does having 2 faces hurt?
    vdraken
    • Who is "you people"?

      Who exactly is this group you refer to as "you people"? My guess would be that any Linux advocate would fall in that category.

      I use and advocate Linux where it makes sense. I guess that would make me one of your "you people". Contrary to your statement, I don't wish to see Linux OR Windows the target of misleading information. A vulnerability should be counted only once for a given platform, regardless of how many exploits for that vulnerability there might be.
      mosborne
    • In that case...

      Should the patch be declared null and void since the variant can take advantage of it? If so, the number of vulnerabilities outstanding wouldn't change.

      Besides, claiming it as a new vulnerability isn't as bad as counting an Acrobat vulnerability for Mac OS X as a Linux vulnerability.
      SGT_Spam
    • The virus is not counted as a vulnerability.

      The vulnerability itself is counted. But as I have pointed out once before with Fx, a certain "security" company counted one vulnerability in a pre-release of Fx (version 0.7) as six vulnerabilities after release 1.0 shipped. The vulnerability was patched from 0.8 onward, but there were two ways it could have been used (both counted by this firm), one of those ways could have resulted in two different actions (both counted), and an update to each alert was logged (both counted), creating six alerts for one problem that was solved in a pre-release of the product but never mentioned until after the 1.0 release.

      In other words, a non-vulnerability was counted six times as a vulnerability. Compare that to W2kSP4, which had two vulnerabilities with two separate patches released at two separate times, followed later by a patch to one of the patches. Was that one vulnerability or four? If a fix leaves two holes and one of the fixes to those two holes leaves another hole, that's three new holes that never existed before W2kSP4.

      Have you loaded the .NET framework lately? We can talk about v1.x or 2.x, your choice. Up until quite recently, MS referred to .NET 1.0 as an optional software update, which would have been followed by the .NET SP1, which would have been followed by the .NET 1.1 Hotfix. I finally installed all that (as I required it to test a certain web app) and now it finally offers me the .NET 2.0 framework --all of which required rebooting.

      Note too, that the .NET 2.0 framework was out a long time ago.
      The King's Servant
    • Totally agree

      In addition, you never see these people trying to categorise vulnerabilities into app and OS when the problem is on Windows.

      They're always quick to forget that a Windows vulnerability may exist because the app designer messed up. It always comes back to MS's fault.

      This article is a prime example. Where are the figures for how many of the Windows faults were duplicates or app problems? I accept that it may be a smaller percentage, but the total absence of these figures shows the mindset.
      Fujikid
  • Does it really matter?

    Oh I know, the *nix folks think so and like to use *security* as their marching theme but does it really matter to the market?

    The fact is, either *nix or Windows can be set up very securely and that is the real issue. Just as everyone can have a door on the house, the difference is how it is locked.

    Claims by this group or that group about who is "most secure" really boil down to the group's viewpoint and their bent on the subject. Most people understand that and take reports and such with a grain of salt and keep on going about their business.
    No_Ax_to_Grind