
Security fails without usability

When you make security hard to use, users look for a way around it. That's why efforts to make the internet more secure must be held to a high usability standard.
Written by Larry Seltzer, Contributor

There's a general trade-off between usability and security. It's an old phenomenon, going back well before the computer age. General Benjamin W. Chidlaw, while commander in chief of the joint service Continental Air Defense Command (part of what eventually became NORAD) in 1954, put it this way:

    Simply put, it is possible to have convenience if you want to tolerate insecurity, but if you want security, you must be prepared for inconvenience.

We hadn't yet invented the word "usability" in 1954, but in this context it means pretty much the same thing as convenience.

Flash forward to 2014, and the trade-off still holds: if being secure were convenient, there wouldn't be so much insecurity about.

Professional security software has always tended to be difficult to use, at least to use properly. Perhaps the classic example is PGP (Pretty Good Privacy), a program written in 1991 to secure email. PGP combines asymmetric public key cryptography, symmetric encryption and hashing to let users exchange messages securely and to prove the provenance of those messages.
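
To make that hybrid design concrete, here's a minimal sketch of the scheme PGP popularized, written against Python's "cryptography" package rather than PGP itself. The key sizes, message and variable names are illustrative assumptions, not anything PGP-specific.

    # Sketch of PGP-style hybrid encryption -- illustrative, not PGP's format.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Each party holds a long-term asymmetric key pair.
    sender_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    message = b"Meet me at the usual place."

    # 1. Encrypt the message with a fresh one-time symmetric "session" key.
    session_key = AESGCM.generate_key(bit_length=128)
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, message, None)

    # 2. Encrypt the session key with the *recipient's* public key, so only
    #    the recipient's private key can unwrap it.
    wrapped_key = recipient_key.public_key().encrypt(
        session_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))

    # 3. Sign a hash of the message with the *sender's* private key --
    #    this is what proves provenance.
    signature = sender_key.sign(
        message,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256())

Real PGP bundles these steps into a single message format; the pain comes from managing the public keys that steps 2 and 3 depend on.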

PGP has always been high-quality software. The US federal government was so alarmed at the prospect of people communicating securely that it opened a criminal investigation of Phil Zimmermann, PGP's author. At the time, US export regulations deemed cryptographic software that used keys larger than 40 bits to be munitions (!) requiring a special license.

But using PGP is a clumsy, multi-step process that requires users to keep track of other users' public keys. There have been attempts to integrate it into more popular email programs, but I've never been impressed with any of them. It's amazing how little the usability of PGP has improved in 23 years.
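
To see what that bookkeeping looks like in practice, here's roughly the happy path using python-gnupg, a wrapper around GnuPG, the free OpenPGP implementation. The file name, addresses and passphrase are hypothetical, and the fingerprint check in step 2 still has to happen out of band.

    import gnupg  # pip install python-gnupg; requires the gpg binary

    gpg = gnupg.GPG(gnupghome="/home/alice/.gnupg")

    # Step 1: obtain the recipient's public key somehow and import it.
    with open("bob_pubkey.asc") as f:
        result = gpg.import_keys(f.read())

    # Step 2: verify the fingerprint with Bob over some other channel
    # (phone, in person...), then mark the key trusted.
    gpg.trust_keys(result.fingerprints, "TRUST_FULLY")

    # Step 3: encrypt to Bob's key and sign with Alice's own key.
    encrypted = gpg.encrypt(
        "the actual message",
        result.fingerprints,
        sign="alice@example.com",
        passphrase="alice's key passphrase")
    assert encrypted.ok, encrypted.status

Every correspondent means another import-verify-trust cycle, and losing a private key or passphrase means losing the mail encrypted to it. That's the usability tax.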

Recently I wrote about the latest attempt to make secure email accessible: Mailvelope, a PGP-like system designed to work within any webmail program. Google also recently announced an effort to make end-to-end encryption usable within Gmail, and Yahoo has announced that it will do the same by forking Google's implementation. But Mailvelope is easy to use only compared to command-line PGP.

The complexity of PGP doesn't just make it hard to use, it makes it insecure. Consider Matthew Green's recent savaging of PGP, in which he describes all the things that go wrong because it's all so complicated. As Green says, "[t]ransparent (or at least translucent) key management is the hallmark of every successful end-to-end secure encryption system." 

The recent push for secure software is an aspect of the privacy mania resulting from Edward Snowden's revelations. A PR push for it came from Reset The Net, a campaign to make the internet NSA-resistant.

Privacy is all well and good, and in fact the tools for it have been available to internet users for decades. They're just too hard to bother with, and a PR campaign won't make them any easier. There is a long history of people proficient enough with computers to find this software usable arguing that everyone should use it. Telling a normal, non-techie human being to use PGP is unfair to that person.

Some of the newer high-security systems are easier to use. This EFF story advocates the use of Tor, a network that anonymizes internet traffic, to protect privacy. But the jury's still out on Tor if you ask me; there are many potential security problems with it. On top of that, Tor is a favorite haunt of the not-so-nice, such as those selling drugs and trafficking in children.

The trade-off between convenience and security is a surprisingly universal concept. It applies to programming as much as to the end-user experience on computers. The infamous Heartbleed bug is, in a sense, the result of an attempt to enhance the convenience of secure programming. Establishing and renegotiating secure connections is expensive, and writing communications software that does it constantly is a pain. To make it less necessary, a standard for a TLS Heartbeat Extension was created: a periodic "heartbeat" message keeps the connection open. OpenSSL implemented the code in 2012 and turned it on by default.
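
The bug itself comes down to a missing bounds check: a heartbeat request carries a payload plus a sender-declared payload length, and the vulnerable code echoed back the declared length without comparing it to the bytes actually received. Here's a schematic Python reconstruction, not OpenSSL's actual code; the simulated memory contents are invented for illustration.

    import struct

    # Unrelated data that happens to sit next to the request in memory.
    SECRETS = b"...private keys, session cookies, other users' requests..."

    def heartbeat_response(record: bytes, patched: bool) -> bytes:
        # Simulated heap: the received record followed by unrelated data.
        memory = record + SECRETS
        # First two bytes: the payload length the sender *claims* it sent.
        (declared_len,) = struct.unpack(">H", record[:2])
        if patched and declared_len > len(record) - 2:
            return b""  # the fix: silently discard mismatched requests
        # Echo back declared_len bytes of "payload". Unpatched, this read
        # runs past the record into whatever follows it.
        return memory[2:2 + declared_len]

    evil = struct.pack(">H", 64) + b"hi"            # claims 64 bytes, sends 2
    print(heartbeat_response(evil, patched=False))  # leaks SECRETS
    print(heartbeat_response(evil, patched=True))   # returns b""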

There's nothing inherent in the idea of TLS heartbeats that should diminish security. The problem with them is that they expand the "attack surface" of the program (especially when there's no security auditing in place, as was the case with OpenSSL). New code creates the potential for new vulnerabilities. In the immediate aftermath of the Heartbleed revelations, I witnessed some contentious discussions over this on Twitter between security experts I respect.

Not long ago I wrote about Passpoint, a new standard designed to make secure public Wi-Fi easier. Passpoint, also known as Hotspot 2.0 and a bunch of other names, shows how to do a high level of security and privacy right: by making it completely seamless to the user.

PassPoint also shows how hard it is to do security well. The standard has been in the works for years. The technical end of it isn't the problem. The hard part is that there are so many significant players involved, from software companies to broadband providers to mobile networks.

It's worth noting that much of what this story, like many others, calls security is really just privacy, which is only one part of security. But effective privacy requires not just a secure architecture, like PGP's, but good security throughout the rest of the software system. If the server can be hacked and its database scraped for personal information, it doesn't matter how secure everything else was.

This is why security is so hard: it has to be done right at all levels of the system. And if it's too hard to do, people won't do it. For that reason, difficult security just isn't good enough, and we should demand better.

[PGP workflow diagram: "It's easy to use PGP!" Image courtesy Wikimedia Commons.]