It's all we're talking about today: Apple will oppose a court order that it build software to help the FBI crack the encryption on the San Bernardino terrorist's iPhone. It's an outrage, isn't it?
Once you look at the details and consider the facts, it's not so outrageous. Central to understanding the order is the fact that the iPhone has a unique hardware ID, built into the CPU, that cannot be read out. This ID is mixed with the user's passcode to form the encryption key, so any cracking of the encryption must be done on the device itself. All of this is explained well in this blog entry by security researcher and ISV Dan Guido.
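To see why the UID matters, here is a minimal sketch of the idea. The function name, the use of PBKDF2, and the iteration count are all illustrative assumptions on my part; Apple's real key "tangling" happens inside the phone's dedicated AES hardware, not in software like this. The point the sketch makes is structural: because the UID never leaves the chip, you cannot compute candidate keys on some other machine.

```python
import hashlib

def derive_key(passcode: bytes, hardware_uid: bytes) -> bytes:
    """Illustrative only: 'tangle' the user's passcode with a
    device-unique hardware ID to produce the encryption key.
    PBKDF2 here is a software stand-in for the one-way mixing
    that Apple's AES engine performs in hardware."""
    return hashlib.pbkdf2_hmac("sha256", passcode, hardware_uid, 100_000)

# An attacker who copies the encrypted flash storage off the phone
# still cannot test passcode guesses, because each guess needs the
# UID, which is locked inside this phone's silicon.
key = derive_key(b"1234", b"\x00" * 32)  # the UID value here is made up
```

Swap the UID and the same passcode yields a completely different key, which is exactly why the FBI needs this particular phone powered on and cooperating.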
The FBI's plan is to "brute force" the password by using an automated device to enter passcodes until one works, but iOS blocks such testing in two ways (quoting Guido): iOS may completely wipe the user's data after too many incorrect PIN entries, and iOS introduces a delay after every incorrect PIN entry, with the delays becoming substantial over time.
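With those two protections removed, the arithmetic turns in the FBI's favor. A rough sketch, assuming the roughly 80 milliseconds per guess that Guido's post cites for the hardware key derivation (that figure, and this function, are my assumptions for illustration):

```python
def time_to_try(n_guesses: int, per_entry_secs: float = 0.08) -> float:
    """Rough worst-case time (in seconds) to brute-force n_guesses
    passcodes once the firmware's wipe and escalating delays are
    removed, leaving only the ~80 ms the hardware key derivation
    itself takes per attempt (an assumed figure)."""
    return n_guesses * per_entry_secs

# A 4-digit PIN has 10,000 possibilities:
print(time_to_try(10_000) / 60)       # about 13 minutes, worst case
# A 6-digit PIN has 1,000,000:
print(time_to_try(1_000_000) / 3600)  # about 22 hours, worst case
```

That is why the order targets the software protections rather than the encryption itself: the hardware delay alone is no real obstacle for short numeric passcodes.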
So the FBI is requesting that Apple make a special version of iOS to be run on this specific iPhone (and, presumably, others subject to similar orders in the future) so that the password may be cracked. The entire process could be done in Apple's offices. I have no doubt that Apple has the necessary screen-tapping hardware, so it could perform all the tests as well and pass the decrypted phone image on to the FBI, although I suppose an FBI agent would have to be present to maintain the chain of custody.
Apple is expressing concern that this version might become public and users will be at risk. Some, including my colleague David Gewirtz, share this concern. I don't.
If such a version were to leak out it would be a bad thing for sure. The same would be true if the source code to iOS leaked out, as others could then make the crackable version and use it. The same could be said for the designs of Apple's encryption hardware in newer iDevices. Are we all that worried about the iOS source code somehow leaking out?
Apple has no reason to give the software to anyone else, even within Apple. Very few people need access to it. The whole thing could be done on Apple's premises and we can reasonably assume they have adequate security for it.
Make sure to distinguish this from calls for Apple to put a facility in iPhones so that law enforcement, without Apple's involvement, can get around encryption. That's a back door. This is just complying with a valid court order.