Who's accountable (or liable) for software security?

Summary: Bruce Schneier has added his viewpoint to the debate that started with Howard Schmidt's comment that programmers should be held personally accountable for the quality of their code. In a Wired News column, Schneier writes: He's on the right track, but he's made a dangerous mistake.


Bruce Schneier has added his viewpoint to the debate that started with Howard Schmidt's comment that programmers should be held personally accountable for the quality of their code. In a Wired News column, Schneier writes:

He's on the right track, but he's made a dangerous mistake. It's the software manufacturers that should be held liable, not the individual programmers. Getting this one right will result in more-secure software for everyone; getting it wrong will simply result in a lot of messy lawsuits.

Security expert Schneier makes the mistake of thinking that Schmidt (former White House cybersecurity adviser as well as Microsoft and eBay security czar) said that programmers should be held liable. Schmidt said only that programmers should take personal responsibility for writing secure code, and that performance reviews should take into account whether their code adheres to approved security models. He also advocates more training for developers to help improve the level of security in the products they build.

Schneier goes on to say in his Wired News column:

He [Schmidt] wants individual software developers to be liable, and not the corporations. This will certainly give pissed-off users someone to sue, but it won't reduce the externality and it won't result in more-secure software.

Computer security isn't a technological problem -- it's an economic problem. Socialists might imagine that companies will improve software security out of the goodness of their hearts, but capitalists know that it needs to be in companies' economic best interest. We'll have fewer vulnerabilities when the entities that have the capability to reduce those vulnerabilities have the economic incentive to do so. And this is why solutions like liability and regulation work.

Again, Schmidt never said he favors making individual software developers liable, and he would likely agree with Schneier that the economic incentives (such as customers refusing to buy insecure products) haven't reached a tipping point. Schneier favors liability and regulation targeting the companies. Holding companies liable for negligent coding makes sense to me, but in my conversation with Schmidt, he said that he doesn't favor legal remedies. As Schneier concludes:

If end users can sue software manufacturers for product defects, then the cost of those defects to the software manufacturers rises. Manufacturers are now paying the true economic cost for poor software, and not just a piece of it. So when they're balancing the cost of making their software secure versus the cost of leaving their software insecure, there are more costs on the latter side. This will provide an incentive for them to make their software more secure.

The question is how to define what you can sue a software vendor over. Without clearer parameters about what kinds of software vulnerabilities and incidents justify legal remedy, the ambulance chasers will have a field day...

Talkback

17 comments
  • Depends on how much you paid.

    You shouldn't be able to sue someone who provided you the software free of charge, source code and all. It makes sense, though, for paying customers to have some recourse, but damages should not exceed the purchase price of the software.
    John L. Ries
    • Limits

      I agree this is a very unclear area. If I take a propane grill off the curb (meant for the scrap man) and upon firing it up it explodes, do I have a right to sue for damages? Probably, but in a perfect world I would not. If I buy a new grill and it explodes (due to product defect, not misuse), do I have the right to sue? Definitely. I lost the right side of my face, one eye, and am horribly disfigured, have medical bills in the thousands, and I can sue for the $219.00 I paid for this little propane bomb?! (Let me make it very clear this is hypothetical; this has not happened to me or anyone I know, or even heard of for that matter!) Taking something for free AS-IS should be understood. When paying for a product, it should perform as advertised. There will always be unknown defects; that is the nature of invention. But in cases like the Pinto gas tank issue or the Firestone 721 radial tires, there was a known defect, and it was decided by the company to sell anyway, as liabilities were less than revenue. There should be some liability above the price of the product. Where to draw the line? If I had answers like that I would not be working as a tech and posting here, I would be putting together a plan for world peace.
      Seenidog
      • Hmmm, but if you buy a known defective product.

        If you buy, say, a case of dynamite and it goes off accidentally while you're present, is that the manufacturer's fault, or does it come under the "damn fool rule," meaning you accepted the hazards of using/handling dynamite when you bought it? To make it realistic compared to software, some group of unforeseen circumstances came together at just the "wrong" time that caused the dynamite to go off. Say it is a dry day, the winds are hot and dry from the west, the plastics delivery truck parked next to the rain gutter of your building, a static charge built up and a spark jumped to the gutter, the gutter wasn't grounded because your secretary hit it earlier that day with her car, the charge traveled into the building via the steel beams and bam, it's over. The same sort of weird situations are BOUND to happen when you have every conceivable hardware platform and software install in the world running the software. In all my years I have never seen two identical PCs if they have been in service more than a day.

        Or better yet, to use your propane bomb analogy, what if you buy it from someone's yard sale, are made aware it may not be perfect, but because it's the cheapest thing meeting your needs you accept the risk and buy it. Whose fault is it now? Isn't all software very much like that? You KNOW there is no such thing as perfect software, you KNOW at some point it will fail and in all probability will represent a loss to you, but you still buy it while being in full knowledge of the facts. At that point how is it anyone's fault other than yours?

        You point out that these are known issues. Well, so was the faulty propane bomb you accepted, so is handling dynamite, so are a lot of things. The plain simple fact is people make a trade-off of cheap function over perfect anything. That is NO ONE'S fault other than the consumer's, and it should and must be the consumer that accepts the full responsibility. Especially when the manufacturer is UP FRONT and tells you the product may have issues, may not fit your needs, and will not be responsible for your actions, inactions, suitability to task, or losses.

        Good grief, do you need a tree to fall on you???
        No_Ax_to_Grind
        • Just to add to it...

          I said: "To make it realistic compared to software, some group of unforeseen circumstances came together at just the "wrong" time that caused the dynamite to go off. Say it is a dry day, the winds are hot and dry from the west, the plastics delivery truck parked next to the rain gutter of your building, a static charge built up and a spark jumped to the gutter, the gutter wasn't grounded because your secretary hit it earlier that day with her car, the charge traveled into the building via the steel beams and bam, it's over."

          I want to add to it by saying that while all this is happening you have to keep in mind there are idiots just outside your door throwing lit matches in with your dynamite just for the "fun of it". (Hackers, spyware, viruses, Trojans, you name it.)
          No_Ax_to_Grind
    • MS will love this

      Everybody downloads IE and Media Player free. MS is not liable as per your post.
      treg
      • But IE and WMP are part of Windows

        So integral that MS can't let you uninstall them (even if you don't have a sound card, in the case of the latter). Since that has been MS's public position for the last 7-8 years, they would be liable.
        John L. Ries
  • The Programmer's Corporate Liability

    I agree that there should be fairly well defined grounds for liability for insecure or ineffective software. The damn EULAs make absolutely no promises beyond the fact that there might be some software on the disk that the owner claims is his.

    I think that if a company wants to exercise its rights of copyright then it MUST define what functions the software will perform in sufficient detail so that purchasers can determine, before they rip open the shrink-wrap, that the software will do the job. Microsoft is not the only company that promises to increase your ROI or some other BS without concrete proof. If the detailed definition does not exist, then the copyright is null and void. If the software can be demonstrated not to perform the functions the owner claims it will, then the "copyright" owner has to pay up in an automatic class-action payoff in cash or check to all of the purchasers of the software. The purchasers could have the option of receiving free updates to repair the damaged functions.
    Xwindowsjunkie
    • Sorry, you don't get to write the copyright laws.

      I suggest that if you want to change them you enroll in a good law school asap.
      No_Ax_to_Grind
    • Differences

      There are two sets of differences here that need to be addressed. First of all, there is a difference between "insecure" and "ineffective" software. In the latter case, the software is indeed not performing as advertised, and there should probably be some liability on the part of the software vendor. Software that crashes would fall in this area, but how long has it been since we heard much about software crashing? In the former case, there is no true "defect" in the software. The software performs as expected, so long as the vendor doesn't claim perfect security for the software. No problem arises until a 3rd party takes action, which leads me to my next point.

      Secondly, there is a difference between "defective" and "insecure". People keep making the mistake of trying to equate defective products (such as a faulty propane tank, or faulty brakes on a car) with software security "holes". In the former case, no action is required on the part of the consumer for the flaw to manifest itself; it will arise during the normal use of the product. In the case of insecure software, however, the software performs as expected during normal operation by the consumer. It's only through the actions of 3rd parties (malware writers) that problems arise. If it weren't for virus and spyware writers, almost none of the recent Windows "flaws" would even exist - the flaws themselves are defined by the ability to exploit them.

      Now certainly, there are cases where true flaws (such as buffer overflows) exist, and such flaws are indeed the responsibility of the software vendor to correct. Such flaws exist in all software, even the much-touted "many eyes" open-source. But even so, in most cases these buffer overflows don't cause a problem directly, they simply leave a "hole" that a 3rd party can exploit. If no 3rd party ever exploits the hole, is it really a hole?
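
      For readers unfamiliar with the flaw class named above, here is a minimal C sketch of the buffer-overflow pattern (function names and strings are illustrative, not from any real product). It shows the point being made: the unchecked copy works fine on benign input and only becomes a "hole" when oversized, hostile input arrives; a bounds-checked copy closes the hole by truncating.

      ```c
      #include <stdio.h>
      #include <string.h>

      /* Illustrative only: a fixed-size buffer copied into with no length
       * check behaves correctly on benign input and fails only on hostile,
       * oversized input, the classic buffer-overflow pattern. */
      void unsafe_copy(char *dst, const char *src) {
          strcpy(dst, src);  /* no bounds check: overflows dst if src is too long */
      }

      void safe_copy(char *dst, size_t dstlen, const char *src) {
          snprintf(dst, dstlen, "%s", src);  /* truncates instead of overflowing */
      }

      int main(void) {
          char buf[8];
          unsafe_copy(buf, "ok");  /* benign input: works exactly as expected */
          printf("%s\n", buf);
          safe_copy(buf, sizeof buf, "a deliberately oversized attacker string");
          printf("%s\n", buf);     /* safely truncated to fit the buffer */
          return 0;
      }
      ```

      The same distinction drives the liability question: the unchecked version is not "defective" in normal operation, yet it is exactly the kind of latent hole a third party can exploit.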

      On a final note, I will say that, even though I don't see most of the so-called Windows security "defects" (as some like to call them) as cause for legal liability on the part of Microsoft, it would still be good corporate policy to address such issues, if nothing else for public relations. Companies always get ahead by catering to their customers, even when the customers are wrong. Microsoft has a history of catering to its customers. It would be better if Microsoft were to just bite the bullet and make Windows more secure, and spend some money on educating the public as to why some software may not work as well under tighter security.

      Carl Rapson
      rapson
      • Security and functionality...

        "It would be better if Microsoft were to just bite the bullet and make Windows more secure, and spend some money on educating the public as to why some software may not work as well under tighter security."

        That sentence raises an interesting issue: if certain software doesn't work as well under tighter security, how would that affect the public's feelings on a) that product and b) security? Would the public be willing to accept whatever idiosyncrasies occur as a result of higher security (preventing direct access to certain components of software or hardware that would optimize performance or functionality)? And to apply it more closely to the topic at hand, would people be willing to accept the consequences of software liability laws (lengthened development cycles, increased software prices, etc.)?
        Third of Five
        • Which is where...

          ...the education part comes in. After all, it seems that one of the basic assumptions underlying the desire to have the computer-using public move to Linux is that they will, indeed, accept less functionality in exchange for more security. If not, what chance does Linux ever have?

          Carl Rapson
          rapson
          • Re: Which is where...

            I certainly did not intend to open that particular can of worms. Honestly, I've never even used Linux, so I can't really say anything about specific functionality issues.

            But it would probably be a bit difficult to get a massive transition to a different operating system in any case, even when all other things are equal.
            Third of Five
  • For crying out loud, grow the frell up and act like an adult.

    If security or bug free software is desirable then do yourself and the freakin world a favor and shop for something that meets your needs! Do NOT go for the lowest cost product with the features you want and then whine about your choices.

    I can show you operating systems/hardware platforms that are unbelievably secure and for the most part bug free. (The apps you run are your choice, so buy secure ones.) The only requirement is you get your wallet out and possibly take out a second mortgage on your first born. That is YOUR CHOICE. Deal with it.

    Users and yes even all knowing IT people get EXACTLY what they want and desire and they do it over and over again with upgrades, add ons, or completely new versions, etc...
    No_Ax_to_Grind
  • Accountability

    The simplest way to hold the companies accountable for secure software is not to buy defective or insecure software. I know that this sounds tautological, but it seems like one of the less intrusive ways to deal with this.

    To some extent, the purchaser understandably assumes some level of risk when purchasing software, just as he assumes certain risks when downloading software off some random freeware site. That said, the EULA (specifically the part where the manufacturer expressly denies any responsibility) is severely tilted in favor of the manufacturer.

    I think a lot of the "who's accountable" issue lies in the belief that it is unconscionable that this industry seems to be the only one in which the customer has no real protection against defects or damages.

    As for the issue of "you know what you're getting into vis-a-vis the bugs," most vendors are not likely to be up front with an exhaustive list of defects in the software, or they themselves may not have found every bug yet.

    The main issue is, as anyone who has ever done anything more complex than "Hello World" knows, perfect software is virtually impossible. There comes a point at which you can only exterminate so many of the bugs before you wind up creating new ones. This brings up another issue for accountability: what would constitute an acceptable threshold, and what would be counted against this threshold? (One example is the fabled "64,000 bugs" in Windows 2000, many of which were actually more issues of style and "couldn't we do it like this" than serious flaws.) For that matter, how would this threshold be calculated? Would it be a universal and concrete number of flaws, or would it be proportional according to various factors (complexity of the code, purpose of the program, cross-platform considerations*, etc.)?

    *This could be an issue both for professional developers and hobbyists. One particularly interesting example would be the transitional period for Apple going from PowerPC to Intel chips. It will be interesting to see how many developers will throw their hands up in concession and simply code to one spec rather than code for both and bear the accompanying burden of having to resist the temptation to optimize for one platform over the other. But that's another matter.
    Third of Five
  • An issue 25 yrs. ago

    I didn't see the word "malpractice" anywhere. Was this term deleted from every dictionary? Individual consultants (paid by a client directly) should be responsible for weak code. It's the risk they take being on their own and making all that money. A software house (regardless of where it's located) should bear the brunt of responsibility for bad code generated by the project team. The amount of culpability would depend on how much the client was damaged economically. This would depend on the courts. Writing good solid contracts would alleviate much of the heartburn when things go wrong.

    Sound policies and procedures (structured skills, good doco, sound testing criteria and peer review) would significantly reduce exposure, but are the software houses willing to spend the time? The sooner a project goes to client acceptance the larger the profit margin. This has been Microsoft's Achilles' heel ever since Windows 3.1 came out.

    This is where IT contract lawyers earn their fees. A well developed contract benefits both parties.
    jack@...
    • Contracts and licenses

      Except that 95% of all users don't have a truly negotiable contract for their software. They buy software off the shelf under an End User License Agreement which is totally imposed by the lawyers of the software company.
      As someone else mentioned, we could get our own personal secure software by paying huge amounts for private software developers to write it for us. That would mean that big corporations and very rich individuals would be the only ones running computers. (And Bill Gates would not be one of those people, because without mass-produced software the chairman of Microsoft would not be the richest person in America.)
      Instead we mass produce software. We also mass produce automobiles. Strangely enough we insist that the cars we buy don't blow up and make the auto companies responsible for ensuring that their cars do not blow up. We even insist that their cars do not blow up when a third party comes along and hits their cars. Now this example could be carried too far, since we don't insist that cars resist third parties that attempt to blow them up with rocket propelled grenades. But a piece of spyware or a virus is not an RPG it is a rock thrown at our car and we have a right to expect it to not make our car blow up.
      carlino