Experts rip Ray Ozzie's plan for unlocking encrypted phones

The former Microsoft executive's idea collapsed in the face of expert scrutiny.
Written by Zack Whittaker, Contributor

It's 2018, and we're still talking about cryptographic backdoors. It's the bad idea that just won't die.

But don't worry: A former Microsoft executive thinks he can beat a dead horse with a not-so-new idea that took about a day for everyone else to rip apart.

Ray Ozzie, who most recently served as Microsoft's chief software architect, revealed his idea for letting law enforcement into encrypted phones and devices in a lengthy Wired article.

Device encryption has long been a headache for police and feds who want to access data on locked devices. Law enforcement agencies have pushed tech companies to fix the problem -- even by way of suing them. But security experts say a secure backdoor can't be built, because any backdoor can be hacked or abused.

As things stand now, phone makers have cut themselves out of the loop by allowing phone owners to encrypt their devices with passcodes so even the manufacturers can't get in.
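
As a rough sketch of why that locks everyone out, here's what passcode-based encryption looks like in miniature, written in Python with the `cryptography` package. This is a deliberate simplification -- real phones mix the passcode with keys fused into tamper-resistant hardware -- and the passcode value is purely illustrative:

```python
# Illustrative sketch only: real devices use hardware-backed key
# derivation, not plain PBKDF2 in software.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_key(passcode: str, salt: bytes) -> bytes:
    """Stretch the owner's passcode into a 256-bit AES key."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=1_000_000)
    return kdf.derive(passcode.encode())

# The key exists only while the owner enters the passcode; the
# manufacturer holds nothing, so it has nothing to hand over.
salt = os.urandom(16)
key = derive_key("123456", salt)  # hypothetical passcode
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"user data", None)
```

The point is that the decryption key is derived from the passcode on demand and never stored, which is exactly what cuts the manufacturer out of the loop.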

Ozzie's idea, which he called Clear, requires phone makers to intentionally put themselves back in the loop using a technique known as key escrow.

Here's how his idea works: A phone maker, like Apple or Google, generates a public and private key for every phone. The public key goes into the phone, and the private key goes into a secure vault at the phone maker's headquarters. When the phone owner sets a passcode, the phone encrypts that passcode using its public key and stores the result. If the police ever need access to the phone, they can obtain a warrant and boot the phone into a police-only recovery mode, which deliberately bricks the phone to prevent anyone else from gaining access to its data. From there, the police can read out the user's encrypted passcode and hand it to the phone maker, which uses the corresponding private key in its vault to decrypt it. With the decrypted passcode in hand, the police can unlock the phone's data.
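
In cryptographic terms, that round trip is plain public-key escrow. Here is a minimal sketch of the flow, again in Python; RSA-OAEP stands in for whatever algorithm a vendor would actually choose, and all key and passcode values are hypothetical:

```python
# Sketch of the Clear-style escrow round trip (a simplification of
# the proposal described above, not Ozzie's actual implementation).
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# 1. At manufacture: the vendor generates a keypair per phone. The
#    private key stays in the vendor's vault; the public key ships
#    on the phone.
vault_private_key = rsa.generate_private_key(public_exponent=65537,
                                             key_size=2048)
phone_public_key = vault_private_key.public_key()

# 2. On the phone: the owner's passcode is encrypted under the public
#    key and stored. The phone itself cannot reverse this.
escrowed_passcode = phone_public_key.encrypt(b"123456", OAEP)

# 3. With a warrant: police put the phone in recovery mode (bricking
#    it), extract the escrowed blob, and send it to the vendor, which
#    decrypts it in the vault and returns the passcode.
recovered = vault_private_key.decrypt(escrowed_passcode, OAEP)
assert recovered == b"123456"
```

Every line of that sketch is routine, and that is part of the problem: the cryptography is the easy part. The hard part, as the critics argue below, is operating the vault.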

Key escrow is a nice idea, except it's been done before and has never worked -- largely because a single flaw, or a single legal demand, can undermine the whole scheme.

Wired's article was light on criticism, but the rebuke from security experts and cryptographers was heavy.

Rob Graham, chief executive of Errata Security, said in a post that Ozzie's idea is "only solving the part we already know how to solve" and "deliberately ignoring the stuff we don't know how to solve."

"We know how to make backdoors, we just don't know how to secure them," he said.

Matthew Green, a cryptography professor at Johns Hopkins, said in a blog post that "the reason so few of us are willing to bet on massive-scale key escrow systems is that we've thought about it and we don't think it will work."

"Ozzie's proposal relies fundamentally on the ability of manufacturers to secure massive amounts of extremely valuable key material against the strongest and most resourceful attackers on the planet," he said.

"And not just rich companies like Apple," he said. "We're also talking about the companies that make inexpensive phones and have a thinner profit margin. We're also talking about many foreign-owned companies like ZTE and Samsung."

There are substantial technical reasons why Ozzie's idea wouldn't work, but the policy reasons are just as damning.

Riana Pfefferkorn, a cryptography fellow at Stanford Law School, who earlier this year wrote a whitepaper on "responsible encryption," said that the system could threaten civil liberties and human rights.

In the US, law enforcement can force you to use your fingerprint or scan your face to access and search your phone. These biometric seizures happen more often than you might realize. But you cannot legally be compelled to unlock a device protected with just a passcode, because the Fifth Amendment protects what's stored in your head, but not what's on your body.

"If Ozzie's proposal were implemented, it would give the police a way to lean on you to open the phone for them," she said in a blog post. "Ozzie's scheme would basically require a self-destruct function in every smartphone sold, anywhere his proposal became law, that would be invoked thousands and thousands of times per year, with no compensation to the owners."

"That proposal does not deserve to be taken seriously -- not in a democracy, anyway," she said.

Technical and policy issues notwithstanding, Ozzie's scheme would also have to work across borders -- including in the countries where popular phone makers are based.

As Graham noted, technology is borderless, but laws and human rights are not. The system could be used to help countries like China or Russia, where human rights are few and far between, get access to anyone's phone.

"A solution in the United States that allows 'legitimate' law enforcement requests will inevitably be used by repressive states for what we believe would be 'illegitimate' law enforcement requests," said Graham.

Steve Bellovin, a researcher at Columbia University, came to a similar conclusion in a blog post.

"Maybe there's a solution, maybe not -- but the proposal is silent on the issue," he said.

Ozzie tweeted -- following Green's criticism -- that the "central question is one of policy, and I hope that can come more loudly to the forefront."

But the central question isn't one of policy; it's a technical one -- and technologists have long said there is no such thing as a "secure backdoor," a phrase the cryptography community treats as an oxymoron.

Until politicians and law enforcement accept that, we'll keep coming up with bad ideas like Ozzie's.
