An introductory letter from CEO Tim Cook also reaffirmed the company’s stance of refusing to build in backdoors for government and law enforcement.
“I want to be absolutely clear that we have never worked with any government agency from any country to create a backdoor in any of our products or services. We have also never allowed access to our servers. And we never will,” Cook wrote. “We also refuse to add a backdoor into any of our products because that undermines the protections we’ve built in. And we can’t unlock your device for anyone because you hold the key — your unique password. We’re committed to using powerful encryption because you should know the data on your device and the information you share with others is protected.”
Apple’s decision to relinquish control of its encryption keys to users lit the fuse on the government’s “Going Dark” crypto debate, which pits Apple and Google against law enforcement. The government contends that because Apple lacks direct access to user data, it cannot be compelled via warrant or National Security Letter to turn over content, putting law enforcement at a disadvantage. Apple, along with security and privacy experts, contends that the decision protects not only user privacy, but also the human rights of people living under sanctioned governments or oppressive leaders.
“Apple has really doubled down on privacy,” said Matthew Green, crypto expert and professor at Johns Hopkins University. “This isn’t something they’re just using for a temporary advantage. They’ve actually spent a significant amount of effort thinking through the privacy and security implications of every part of their platform.”
The approach-to-privacy page is a primer on encryption and how Apple products and services make use of it to secure data. The page explains how Apple Pay secures transaction data, for example, pointing out that payment card data is stored neither on the device nor on Apple’s servers. The encryption of iMessage and FaceTime is also explained in plain English, including how sessions are encrypted and cannot be accessed by Apple or anyone else without the user’s passcode. Protections for Safari, iCloud and health care data are laid out on the page as well.
Users are also given clear instructions on how to use the privacy and security features on their devices and across Apple’s sundry services. From setting passcodes, enabling Touch ID and the Find My iPhone and Find My Mac features, to choosing strong passwords, recognizing phishing attacks and browsing the web privately, the page sets a standard other companies may be forced to follow, experts said.
“I do think the clarity sets a precedent, or at least I hope it does. This is all about allowing consumers to make informed decisions,” said Rich Mogull, founder of security consultancy Securosis. “Apple sees privacy as something consumers care about, and a business differentiator for them. And after talking with various Apple employees and representatives over the years, it’s clear this is the real deal and they take it very seriously internally.”
In its transparency report, Apple spells out its requirements for warrants before handing over device or account data to law enforcement. The report says a small fraction of requests seek content from iCloud accounts, for example.
From the transparency report page:
“On devices running iOS 8 and later versions, your personal data is placed under the protection of your passcode. For all devices running iOS 8 and later versions, Apple will not perform iOS data extractions in response to government search warrants because the files to be extracted are protected by an encryption key that is tied to the user’s passcode, which Apple does not possess.”
Apple said 94 percent of the law enforcement requests it receives seek assistance in finding a stolen device; the remainder are account requests concerning information stored in iCloud accounts. As for National Security Letters, Apple said it received between 750 and 999 requests in the first six months of this year.