Apple Updates Privacy Policies After Siri Audio Recording Backlash

Apple’s “grading” process, which involves listening to Siri voice recordings, will now be conducted in-house, and users will have the option to opt out.

Apple is taking steps to improve the privacy of audio collected by its Siri voice assistant, on the heels of backlash around a program that let contractors listen in on Siri conversations.

On Wednesday, the phone giant apologized for violating users’ privacy through the program, which was implemented to improve audio accuracy, a process that Apple calls “grading.” Moving forward, Apple said that customers will have the choice of opting out of the “grading” process, and that it will bring the program in-house rather than continuing to hire third-party contractors to listen in on the audio.

“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading,” said Apple in a Wednesday post. “We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.”

Earlier in August, Apple suspended the program after a former Apple contractor detailed in The Guardian how contractors regularly listened to intimate voice recordings – including drug deals and recordings of couples having sex. While the clips are anonymized and not linked to users’ names or IDs, they can contain sensitive information such as financial data, sparking a wave of backlash over the privacy implications of the program.

Apple on Wednesday said that the program will resume in the fall with new privacy-focused improvements.

For one, users will have the option to opt out of the program, so samples of their audio requests will not be included in the “grading” process.

Apple will also no longer retain audio recordings of Siri interactions by default; instead, it will rely on computer-generated transcripts in its efforts to improve Siri. Audio recordings will be stored only for users who opt in to the grading program.

And, if users choose to opt in to the program, only Apple employees – not contractors – will be allowed to listen to Siri interaction audio samples, Apple said.

“We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place,” Apple said. “Those who choose to participate will be able to opt out at any time.”

At the same time, according to a report in The Guardian, Apple has also ended contracts for hundreds of workers hired as part of the “grading” program.

These practices have heightened scrutiny of how much data voice assistants collect, what that data contains, how long it’s retained and who can access it. Amazon recently found itself in hot water over a similar employee-listening and recording issue, for instance.


Both Google Home and Amazon Alexa have faced their own backlash over contractors listening in on recordings to improve audio accuracy: In April, Amazon was thrust into the spotlight for a similar reason, after a report revealed the company employs thousands of auditors to listen to Echo users’ voice recordings. The report found that Amazon reviewers sift through up to 1,000 Alexa audio clips per shift – listening in on everything from mundane conversations to people singing in the shower, and even recordings that are upsetting and potentially criminal, such as a child screaming for help or a sexual assault. Amazon is also facing lawsuits from parents who say Alexa records their children without consent.

And in July, Google Home smart speakers and the Google Assistant virtual assistant were caught eavesdropping without permission, capturing highly personal audio of domestic violence, confidential business calls and even users asking their smart speakers to play porn on their connected mobile devices.

Apple did not immediately respond to a request for comment from Threatpost.
