On Tuesday, a US district court ordered Apple to break security protections on an iPhone linked to the San Bernardino attacks, drawing the company into one of the most important legal fights in its history. The legal precedent is serious, but there’s also a real concern that merely creating the software could be dangerous. In his letter to customers, Tim Cook described the tool as "too dangerous to create," warning that it could undo years of security work that protects nearly a billion phones. The government, meanwhile, has portrayed the software as effectively harmless: a single update targeted at a single phone. So how dangerous is this firmware update, really?
For most iPhones, the danger comes from criminals rather than feds. The lock screen is one of the biggest protections against iPhone thieves, who often have to wipe a phone entirely after it’s been stolen. If those thieves had a way to unlock the stolen phones, victims could be exposed to anything from identity theft to extortion, depending on how much sensitive data is on the stolen phone. That threat was one of the main motivations for Apple’s shift to stronger encryption in iOS 8, and any software that unravels those protections could have serious consequences for iPhone users.
Because of that threat, the FBI’s proposed system includes a number of protections to ensure the passcode hack can’t be used by anyone else. Apple must sign any firmware update before a given iPhone will accept it, and the FBI’s proposed update would be coded to an individual phone: unless the phone’s serial number matches the serial number in the code, the software simply won’t install. The method proposed by the FBI is also specific to the 5c, which lacks the Secure Enclave chip that ties lock screen protections to hardware in more recent iPhones. Still, if the FBI is successful, the bureau will likely request similar methods for cracking Enclave-equipped phones.
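The two gates described above — Apple’s signature and the serial-number binding — can be modeled in a short sketch. This is purely illustrative: the function names and serial numbers are invented, and the HMAC stands in for Apple’s actual code-signing scheme, which uses asymmetric cryptography and per-device personalization rather than a shared key.

```python
import hashlib
import hmac

# Stand-in for Apple's signing key; in reality this is a private key
# held by Apple, and devices verify with the matching public key.
APPLE_SIGNING_KEY = b"apple-private-signing-key"

def sign_firmware(image: bytes, target_serial: str) -> bytes:
    """Sign a firmware image pinned to one device's serial number."""
    payload = image + target_serial.encode()
    return hmac.new(APPLE_SIGNING_KEY, payload, hashlib.sha256).digest()

def device_will_install(image: bytes, target_serial: str,
                        signature: bytes, device_serial: str) -> bool:
    """Mimic the device-side check: valid Apple signature AND matching serial."""
    expected = hmac.new(APPLE_SIGNING_KEY, image + target_serial.encode(),
                        hashlib.sha256).digest()
    if not hmac.compare_digest(expected, signature):
        return False                       # not signed by Apple
    return device_serial == target_serial  # pinned to a different phone

firmware = b"passcode-limit-removal-image"
sig = sign_firmware(firmware, target_serial="SERIAL-A")

device_will_install(firmware, "SERIAL-A", sig, "SERIAL-A")  # True: right phone
device_will_install(firmware, "SERIAL-A", sig, "SERIAL-B")  # False: wrong phone
```

In this model, stripping the serial-number check out of the image would also invalidate the signature, which is why a reverse-engineered generic version would still be blocked — so long as the signing system itself holds.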
But while the precise software proposed by the FBI can’t be used to unlock other phones, it can still be useful to thieves. If the code fell into the wrong hands, it could potentially be reverse-engineered into a generic version, removing the code that ties the attack to a specific phone. That reverse-engineered version would still need Apple’s signature before it could be installed — something thieves are not likely to have — but that signature system would be the only thing protecting a stolen iPhone and the information inside it.
That’s an unnerving prospect for security professionals, since no system is ever considered entirely impenetrable. New vulnerabilities surface in software all the time, and for the iPhone, a single exploit can sell for as much as $1 million. iPhone security expert Jonathan Zdziarski says there’s a real concern that an undisclosed vulnerability or existing exploit could be used in ways Apple and the FBI can’t predict. Even if the signature system isn’t broken outright, the techniques in the FBI’s tool could be repurposed to give malware a stronger foothold on a targeted iPhone. "It's not about just stealing one tool," says Zdziarski. "There's a lot going on in software like this, and having a direct tap into how Apple can disable functions moves [attackers] along at light speed."
There’s also the concern that the FBI’s tool may not stay limited once the legal precedent is established. Throughout the encryption debate, the FBI has emphasized the vast number of seized iPhones that Apple has refused to unlock. Yesterday, Manhattan District Attorney Cy Vance claimed to have 175 such devices in Manhattan alone. Forensics expert Robert Lee says he’s worried the volume of requests could lead agents to seek a signed, generic version of the software, which would bypass all lock screen protections if it fell into the wrong hands. "The FBI’s going to come back again and again, and finally they’re going to ask for a version of this that’s generic," says Lee. "And it’s that generic version that’s really dangerous."
In many ways, it’s confirmation of what security professionals have been saying for decades: you can’t build a backdoor without weakening security. And while Apple and the FBI have disagreed over whether it counts as a backdoor, the court-ordered software has many of the same properties. In the right hands, it’s unnoticeable; in the wrong hands, it’s a persistent point of vulnerability. If Apple loses its case and such orders become commonplace, it may be a vulnerability the entire industry will have to grapple with.