Are public vulnerability disclosures ethical?

Written by George Ou, Contributor
This age-old debate has been rekindled by Finjan's rapid disclosure of 10 possible Windows XP SP2 flaws. As it turned out, Finjan's motivation was highly questionable: the company didn't give Microsoft a chance to fix the flaws before disclosing them to the public, and it conveniently had its own software to sell you to protect you from the very flaws it disclosed. That was a clear case of an unethical disclosure, but what about vulnerability disclosures in general, or when something like the WPA Cracker proof of concept gets released into the wild even after ample warning time is given? What are the ethical implications? Although I can see the arguments on both sides, I still have very mixed feelings about this issue.

Here are the facts: when proof-of-concept exploit code or a software vulnerability is publicly released, even after the vendor has been given ample time to patch the flaw, that disclosure or proof-of-concept code will be misused by script kiddies and less skillful hackers. No matter how good the intentions are, this is a fact that cannot be denied. On the other hand, if vulnerabilities were never disclosed to the public, software vendors would never bother to fix them, and many of their business customers wouldn't mind.

As bad as it sounds, the cold, hard calculus of business economics dictates that it is better to "hear no evil and see no evil." The threat of a real professional hacker who can figure out his own exploits and code his own tools is an acceptable risk, since there aren't that many of them and they probably wouldn't affect "me" anyway. Even if they do hack in, at least they won't brag about the hack in public, because their motivation is greed. Script kiddies, who do rely on public disclosures and public proof-of-concept code, do the bulk of the real damage because they're in it for the bragging rights of defacing your public website. The public relations backlash from that would do even more economic damage than the hack itself, not to mention the cost of security patching. Now, I personally believe this type of thinking is a bit naive, but I must admit that from a pure dollars-and-cents perspective it makes sense (any of you business types out there think my business assessment is way out of line here?).

After weighing all the issues, vulnerability disclosure is still not a clear-cut decision, and it never will be. But I think forcing vendors to patch their software and the public to apply those patches is in the public's overall best interest. The only way to force vendors and businesses to fix their vulnerabilities is to publicly disclose them, along with exploit proofs of concept if necessary, so long as vendors are given a reasonable amount of time to fix their software. Fixing vulnerabilities is more important than hiding from them. Now that new regulations are coming online to mandate public disclosure of any computer break-ins, businesses may be forced to rethink the importance of good security.
