In this month's CIO magazine, Bruce Schneier publishes one of his best columns ever. "How to Sell Security" starts with a common-sense argument about the psychological dynamics of why and how we as humans respond to sales pitches. It ends with this astute observation about why some computer security companies can't help selling fear:
Security sellers ... are continually trying to frame their products in positive results. That's why you see slogans with the basic message, "We take care of security so you can focus on your business," or carefully crafted ROI models that demonstrate how profitable a security purchase can be. But these never seem to work. Security is fundamentally a negative sell.
One solution is to stoke fear. Fear is a primal emotion, far older than our ability to calculate trade-offs. And when people are truly scared, they're willing to do almost anything to make that feeling go away; lots of other psychological research supports that. Any burglar alarm salesman will tell you that people buy only after they've been robbed, or after one of their neighbors has been robbed. And the fears stoked by 9/11, and the politics surrounding 9/11, have fueled an entire industry devoted to counterterrorism. When emotion takes over like that, people are much less likely to think rationally.
(Don't just settle for that small excerpt. Go read the whole thing.)
I've written about this before (see "The security software industry wants you to be afraid," from February 2005), and the observation is still true. One thing Schneier doesn't mention in this essay is that this psychological reality gives security vendors a powerful economic incentive to find something, anything, and then make some noise about what they found. Not so much that you'll be annoyed, but just enough to let you know they're on the job. Paradoxically, as I observed in the real-world example that inspired that earlier column, flagging a false positive can be economically more valuable to the vendor than correctly staying silent:
Joe feels good because the software told him it had protected him, even though the likelihood that this was an actual attack is microscopic. The lesson that Joe is unwittingly sending to the vendors in question is, “Give me more false positives, because the more times you tell me you’ve protected me from something, the more I’ll feel like I’ve gotten my money’s worth from your software.” If he had a better security program, it would have realized that this outgoing connection was just fine and would not have given him any warning at all.
That is just wrong. On a healthy computer with multiple layers of security, most threats should be blocked or neutralized before the user ever sees them. Getting lots of warnings is a sign that one of those layers isn’t working as well as it should. But that’s exactly the opposite of what motivates developers of security software today.
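The perverse incentive is easy to see in a toy model. This sketch is purely illustrative (nothing like it appears in Schneier's column or my earlier one); the numbers and the `value_per_alert` assumption are hypothetical, chosen only to show how a chatty product can feel more valuable than a quiet one that delivers identical protection:

```python
# Toy model (illustrative only): if every warning a user sees adds to
# their felt "money's worth" -- regardless of whether it reflects a
# real attack -- then a noisy product beats a quiet one on perceived
# value, even when both block the same single real attack.

def perceived_value(alerts_shown: int, value_per_alert: float = 1.0) -> float:
    """The user's felt benefit: grows with every warning displayed."""
    return alerts_shown * value_per_alert

def actual_protection(real_attacks_blocked: int) -> int:
    """What the software actually did for the user."""
    return real_attacks_blocked

# Noisy product: 50 warnings, of which only 1 was a real attack.
noisy_felt = perceived_value(alerts_shown=50)
noisy_real = actual_protection(real_attacks_blocked=1)

# Quiet product: blocks the same 1 real attack, shows no warnings.
quiet_felt = perceived_value(alerts_shown=0)
quiet_real = actual_protection(real_attacks_blocked=1)

print(noisy_felt, quiet_felt)   # perceived value diverges wildly...
print(noisy_real == quiet_real) # ...while actual protection is equal
```

Under these assumptions the vendor that maximizes warnings maximizes perceived value, which is exactly the backwards incentive described above.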
The recent research report released by Australian security vendor PC Tools, which I wrote about last week in Puncturing the myth of the invulnerable OS, is a perfect example of that technique taken to an extreme. Taken at face value, the research data led to a reasonable and fairly obvious conclusion: Windows Vista's security is much-improved over its predecessors, but it does not offer protection from every avenue of attack over a network. For that, you need a multi-layered security strategy that includes user training and accurate, up-to-date antivirus software that works as unobtrusively as possible.
The trouble, as Schneier notes, is that we humans find it difficult to buy based on that perfectly rational approach to security. Instead, we fall for the fear: "ZOMG, viruses! Trojan horses! Cookies!" That response is what allows security software companies to keep expanding their product lines (for an additional cost, of course), and why I get e-mail and comments from people proudly listing the five separate security products they have running at all times. A couple of anti-spyware programs, antivirus software, a firewall, and invariably one extra layer of voodoo software.
That seems crazy to me. There's a big difference between a healthy understanding of the risks of using the Internet and bug-eyed paranoia. Going overboard on security seems as unwise as going out completely unprotected. Finding the right balance takes a little extra work, as Schneier notes: "[Y]ou can never ignore the cognitive bias embedded so deeply in the human brain. But if you understand it, you have a better chance of overcoming it." Exactly.
So, what are the layers in your security strategy?