When federal prosecutors ordered Apple to create a custom iOS build to facilitate a brute-force attack on the San Bernardino terrorist's iPhone 5C, there was almost no dispute over whether Apple could actually do it.
All the cryptography and other security software on the 5C runs as part of iOS on the main processor, and Apple could just change that software while preserving all the data on the device.
But what about the next generation of iOS devices, starting with the 5S, which introduced Touch ID and the Secure Enclave (SE)? The SE was supposedly designed so that even Apple couldn't weaken it, but that turned out not to be the case.
As security researcher and developer Dan Guido noted on his blog: "Apple can disable the passcode delay and disable auto erase with a firmware update to the SE. After all, Apple has updated the SE with increased delays between passcode attempts and no phones were wiped."
This update does not require entering the passcode. The update requested by the government, to refresh your memory, would remove the delays imposed between login attempts and the limit on failed attempts before the device wipes itself.
The most straightforward change Apple can make to foreclose the option of a court order to hack their own devices would be for them to require the user to enter the passcode in order to apply updates to the SE. This is a very minor inconvenience, as users are already required to enter the passcode in many circumstances:
- The device has just been turned on or restarted;
- The device has not been unlocked for more than 48 hours;
- The device has received a remote lock command;
- After five unsuccessful attempts to match a fingerprint;
- When setting up or enrolling new fingers with Touch ID.
But there's one big problem with that plan.
I asked Guido about this and about other options Apple might have to tell the government "no, we can't do that; it's literally impossible." He says such a change would require:
"...modifying the hardware of both the [Secure Enclave] and likely other components. Spinning a new ROM mask or the crypto changes required to support passcode-verified updates will be labor-intensive and require at least one design cycle to complete. The earliest they could get that done is by the iPhone 7S in 2017."
To be clear, this means current and next-generation models can't be changed to prevent such an update.
There are many other possibilities for Apple. Some are clearly feasible; some only Apple knows for sure. Many present practical problems. For example, Apple now requires a passcode of at least six characters and allows it to consist only of numerals. Requiring at least one letter, preferably more, would dramatically lower the chances of brute force succeeding in a reasonable time frame on any particular phone.
It might make sense for an attacker -- the government included -- to test a dictionary of popular passcodes, but true brute force would take too long to bother with. The main problem with this change is that users would hate having to use complex passcodes and would be far more likely to forget them. Apple will be loath to make users hate the iOS experience.
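Some back-of-the-envelope arithmetic shows why adding letters matters so much. The sketch below is my own illustration, not Apple's math: it assumes roughly 80 milliseconds of key-derivation delay per guess (the figure cited for iOS) and compares a six-digit numeric passcode against a six-character passcode drawn from digits plus lowercase letters.

```python
SECONDS_PER_GUESS = 0.080  # ~80 ms per attempt, the delay cited for iOS key derivation

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Time to exhaust every possible passcode of the given length."""
    return (alphabet_size ** length) * SECONDS_PER_GUESS

numeric = worst_case_seconds(10, 6)       # six digits: 10^6 = 1,000,000 codes
alphanumeric = worst_case_seconds(36, 6)  # digits + lowercase: 36^6 ≈ 2.18 billion

print(f"6-digit PIN, worst case:       {numeric / 3600:.1f} hours")
print(f"6-char alphanumeric, worst:    {alphanumeric / (3600 * 24 * 365):.1f} years")
# → roughly 22 hours for the numeric PIN vs. about 5.5 years for alphanumeric
```

Even this modest alphabet change moves a full search from "over a weekend" to "multiple years," which is exactly why an attacker would fall back on a dictionary of popular passcodes instead.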
Apple could update the SE to enforce minimum delays after a certain number of failed passcode attempts and put that logic in ROM, making it non-updatable. As with requiring the passcode for SE updates, this can't be done on current models.
Guido also suggested these possible updates:
- "They could make DFU mode destructive," said Guido. DFU (Device Firmware Upgrade) is the mode used to install a fresh copy of iOS. Making it destructive would delete the data, or at least the keys, in the SE.
- "[Apple] can split out the auto-erase and backoff code in the SEP and move it into ROM," Guido suggested. It's a dangerous option, as it means no software fix is possible if a bug or vulnerability is found.
- "They can change the PBKDF2 iteration count to be vastly higher if you're using Touch ID auth, on the assumption that you only enter a passcode very infrequently if you have fingerprint auth turned on," said Guido. This is an interesting option. PBKDF2 (Password-Based Key Derivation Function 2) derives cryptographic keys from a password. The iteration count is the number of times the underlying function is executed. A high count, causing an intentional delay, is standard practice in password processing to combat brute force attacks: for any one login attempt the delay is unnoticeable, but for an attacker making thousands or millions of attempts it is significant. The standard iOS count imposes a delay of approximately 80 milliseconds on every passcode attempt.
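The iteration-count tradeoff is easy to see in code. This is an illustrative sketch only: it uses Python's standard `hashlib.pbkdf2_hmac` with SHA-256, whereas iOS actually entangles key derivation with a hardware-fused UID key, and the specific counts below are arbitrary demo values, not Apple's.

```python
import hashlib
import time

def derive_key(passcode: str, salt: bytes, iterations: int) -> bytes:
    # PBKDF2-HMAC-SHA256: the passcode is fed through HMAC `iterations`
    # times, so a higher count makes each derivation proportionally slower.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

salt = b"\x00" * 16  # fixed salt for the demo; real systems use a random one

for iterations in (1_000, 100_000, 1_000_000):
    start = time.perf_counter()
    derive_key("123456", salt, iterations)
    elapsed = time.perf_counter() - start
    print(f"{iterations:>9,} iterations: {elapsed * 1000:.1f} ms per attempt")
```

Each tenfold increase in the count multiplies the attacker's per-guess cost tenfold while remaining a one-time pause for the legitimate user, which is Guido's point: a Touch ID user who rarely types the passcode would barely notice even a multi-second derivation.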
There are surely others, known only to a select few in Apple.
It seems to me that requiring a passcode for updates to the Secure Enclave is the obvious next step for Apple, but one which will take some time to get into users' hands.
What if they were to do this? The disruption to users would be minimal, but law enforcement authorities might panic. I don't believe there is any law prohibiting uncrackable encryption, and it's hard to believe, at least in the current political environment, that such a law could be passed. It really is bad news for law enforcement, and they're having a tough enough time as things stand.