Apple’s head of software engineering told law enforcement and the government via a Washington Post op-ed on Sunday that a precedent-setting backdoor into the iPhone threatens to turn back the clock on mobile security to less safe times.
The column, written by Craig Federighi and posted last night, argues that stripping out security features in order to unlock San Bernardino shooter Syed Farook’s phone would undo years of security research and protection for consumers and business users alike.
“They have suggested that the safeguards of iOS 7 were good enough and that we should simply go back to the security standards of 2013. But the security of iOS 7, while cutting-edge at the time, has since been breached by hackers,” Federighi wrote. “What’s worse, some of their methods have been productized and are now available for sale to attackers who are less skilled but often more malicious.”
While Federighi’s op-ed struck a rational and cogent tone, San Bernardino County District Attorney Michael A. Ramos filed an amicus brief last week that has drawn the ire of iOS security specialists.
In his brief, Ramos contends, among other things, that while the phone was connected to the county’s computer network (it was a county-issued iPhone 5c assigned to Farook), it could have been used as a weapon to introduce “a lying-dormant cyber pathogen” threatening the network.
“Ramos’s statements are not only misleading to the court, but amount to blatant fear mongering,” said iOS forensics expert Jonathan Zdziarski in a rebuttal posted on his website Sunday. “They are designed to manipulate the court into making a ruling for the FBI, and in my opinion are egregious enough that Ramos should be held in contempt just for filing what amounts to a crazy apocalypse story.”
Zdziarski dismantles Ramos’ contention primarily by pointing out that calling the supposed malware dormant indicates the county has no evidence it exists on either the device or the network.
“On the one hand, if the county has identified the malicious code on their network, then there’s no need to get into the phone at all, because they’ve already got an unencrypted code sample that they can disassemble and reverse engineer, like malware researchers and teenagers all over the world do,” Zdziarski wrote. “On the other hand, if they haven’t identified it on the network, then that means there was never any infection at all, or that their security people are too incompetent to find it, leaving their network infected with an infection that hasn’t actually shown any evidence of itself.”
This is the latest twist in the Apple-FBI dustup over unlocking the alleged terrorist’s phone. The FBI won a court order in February compelling Apple to help the agency unlock the phone by developing a firmware update that would bypass or remove security features preventing an offline brute-force attack against the device’s passcode.
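To illustrate why those protections matter, here is a minimal sketch (a hypothetical example, not anything from the court filings or Apple’s code) of how quickly a four-digit passcode falls to brute force once retry delays and the auto-erase limit are out of the way; the check_passcode callback is an assumed stand-in for the device’s passcode check.

```python
# Illustrative sketch only: why removing iOS's guess limits matters.
# check_passcode() is a hypothetical oracle standing in for the phone's
# passcode check; the contested order asks Apple to remove the escalating
# delays and optional 10-try erase that normally throttle these guesses.
from itertools import product

def brute_force_pin(check_passcode, length=4):
    """Try every numeric passcode of the given length until one matches."""
    for digits in product("0123456789", repeat=length):
        guess = "".join(digits)
        if check_passcode(guess):
            return guess
    return None

# With no delay between attempts, a 4-digit PIN has only 10,000
# possibilities; even at a modest dozen guesses per second the whole
# keyspace is exhausted in under 15 minutes.
if __name__ == "__main__":
    secret = "7391"  # stand-in for the device passcode
    found = brute_force_pin(lambda guess: guess == secret)
    print(f"Recovered passcode: {found}")
```

On the actual hardware the passcode is entangled with a device-specific key and guesses are rate-limited in firmware, which is why the FBI is asking Apple for a signed software update rather than attacking the encryption directly.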
The two sides have gone back and forth in court, one trying to get the order enforced and the other to get it dropped. Amicus briefs have also been filed in support of each side, the latest for Apple coming Friday from a group of iOS experts asking the court to vacate its order against the company. The brief reinforces Apple’s argument that the order endangers public safety and sets a precedent under which the government and law enforcement could demand similar access to other phones, not just the single device named in the order.
“The encryption technology built into today’s iPhone represents the best data security available to consumers,” Federighi said. “And cryptographic protections on the device don’t just help prevent unauthorized access to your personal data — they’re also a critical line of defense against criminals who seek to implant malware or spyware and to use the device of an unsuspecting person to gain access to a business, public utility or government agency.”