Apple Suspends Siri Program After Privacy Backlash

Apple’s Siri follows Amazon Alexa and Google Home in facing backlash for its data retention policies.

Apple is suspending a program that lets contractors listen in on Siri voice recordings, after a wave of backlash over the privacy implications of the practice.

The suspension comes after a report in The Guardian last week outlining how contractors regularly listen to intimate voice recordings – including recordings of drug deals and of couples having sex – in order to improve audio accuracy, a process that Apple calls “grading.”

On the heels of the report, which drew ire from privacy advocates, Apple said it will take steps to ensure that users can opt out of the program.

“We are committed to delivering a great Siri experience while protecting user privacy,” Apple said in a statement to media. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.” 

In last week’s bombshell story, a former Apple contractor detailed how the grading process exposed Siri users’ private audio snippets. While the clips are anonymized and not linked to the names or identifiers of individuals, they can still contain private and sensitive information, such as financial data.

The revelations have heightened scrutiny around how much data voice assistants collect, what that data contains, how long it is retained and who can access it. Amazon, for instance, recently found itself in hot water over a similar issue involving employees listening to recordings.

“The biggest concern that I have is actually around data retention policies and disclosure,” said Tim Mackey, principal security strategist at the cybersecurity research center at Synopsys, speaking to Threatpost in a recent podcast on voice assistant privacy issues. “So we have an expectation that these are connected devices, and that perhaps short of the Alexa-then-perform-action activity, that the communication, the actual processing of our request is going to occur on an Amazon server, Google server or so forth…. And what we’re learning is that the providers tend to keep this data for an indeterminate amount of time. And that’s a significant risk, because the volume of data itself means that it’s potentially very interesting to a malicious actor someplace who wishes to say, target an individual.”

According to Apple’s privacy page, Siri queries that go through Apple’s servers are protected by encryption and associated with a random identifier rather than a user’s identity, so the data cannot be attributed to an individual user. Voice recordings are saved for six months so that the recognition system can use them to better understand the user’s voice.

After six months, the page says, another copy is saved without its identifier for Apple to use in improving and developing Siri for up to two years. A small subset of recordings, transcripts and associated data without identifiers may be used by Apple beyond two years for ongoing improvement and quality assurance of Siri.
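To make that disclosed retention schedule concrete, here is a minimal illustrative sketch in Swift that maps a recording’s age to the tiers Apple describes. This is not Apple’s actual implementation; the type and function names are hypothetical, and only the month thresholds come from the privacy page.

```swift
import Foundation

// Hypothetical model of the retention tiers described on Apple's privacy page.
// Illustrative sketch only; not Apple's actual code or data model.
enum RetentionTier {
    case identified        // first six months: stored with a random identifier
    case deidentified      // up to two years: copy saved without its identifier
    case qualityAssurance  // beyond two years: small subset kept for Siri QA
}

/// Returns the retention tier for a Siri recording, given its age in months.
func retentionTier(forAgeInMonths age: Int) -> RetentionTier {
    switch age {
    case ..<6:
        return .identified
    case 6..<24:
        return .deidentified
    default:
        return .qualityAssurance
    }
}

// Example: a 10-month-old recording falls in the de-identified tier.
print(retentionTier(forAgeInMonths: 10)) // deidentified
```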

Both Google Home and Amazon Alexa have faced their own backlash over contractors listening in on recordings to improve audio accuracy. In April, Amazon was thrust into the spotlight after a report revealed the company employs thousands of auditors to listen to Echo users’ voice recordings. The report found that Amazon reviewers sift through up to 1,000 Alexa audio clips per shift – listening in on everything from mundane conversations to people singing in the shower, and even recordings that are upsetting and potentially criminal, such as a child screaming for help or a sexual assault. Amazon is also facing lawsuits from parents who say Alexa records their children without consent.

And in July, Google Home smart speakers and the Google Assistant virtual assistant were caught eavesdropping without permission – capturing highly personal audio of domestic violence incidents, confidential business calls and even users asking their smart speakers to play porn on their connected mobile devices.

Privacy will be a focus next week at Black Hat 2019, taking place Aug. 7 and 8 in Las Vegas. Be sure to follow all of our Black Hat and DEF CON 27 coverage right here in Threatpost’s special coverage section.
