Amazon Auditors Listen to Echo Recordings, Report Says


Amazon is under fire for its privacy policies after a Bloomberg report revealed that the company hires auditors to listen to Echo recordings.

Amazon is under fire after a bombshell report revealed the company employs thousands of auditors to listen to Echo users’ voice recordings.

The team exists to improve Amazon Alexa’s comprehension of human speech so that Echo devices respond more accurately to commands, according to a Wednesday Bloomberg report. The auditors include contractors and full-time Amazon employees working in locations such as Boston, Costa Rica, India and Romania.

“The team listens to voice recordings captured in Echo owners’ homes and offices,” according to Bloomberg’s report. “The recordings are transcribed, annotated and then fed back into the software as part of an effort to eliminate gaps in Alexa’s understanding of human speech and help it better respond to commands.”

According to the report, reviewers sift through up to 1,000 audio clips per shift – listening in on everything from mundane conversations to people singing in the shower, all the way up to recordings that are upsetting and potentially “criminal,” like a child screaming for help or a sexual assault.

The auditors listen to voice recordings captured after the wake word, “Alexa,” is spoken to an Echo device, and annotate a small number of interactions picked from a random set of users.

“We take the security and privacy of our customers’ personal information seriously,” an Amazon spokesperson said in a statement to Threatpost. “We only annotate an extremely small number of interactions from a random set of customers in order to improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone. We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system.”

Amazon’s privacy policy does not explicitly state that a team listens to Alexa conversations. It does give users the option to disable the use of their voice recordings for the development of new features, according to Bloomberg.


Amazon’s Echo Privacy Policy

“When you speak to Alexa, a recording of what you asked Alexa is sent to Amazon’s cloud, where we process your request and other information to respond to you,” according to Amazon’s Echo privacy policy.

Meanwhile, the recordings sent to the auditors do not include a user’s full name or address – instead, they are tied to an account number, the user’s first name and the device serial number, according to the report.

The Amazon spokesperson confirmed that employees do not have direct access to information that can identify the person or account as part of this workflow.

“While all information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption, and audits of our control environment to protect it, customers can always delete their utterances at any time,” said the spokesperson.

It’s not the first time Amazon Echo devices have come under scrutiny.

In December, Amazon inadvertently sent 1,700 audio files containing one customer’s Alexa voice recordings to a random person – and after a newspaper investigation exposed the snafu, characterized it as a “mishap” that came down to one employee’s mistake.

More malicious attacks have also emerged targeting the popular voice assistant – including a flaw discovered last year in Amazon’s Alexa virtual assistant that enabled researchers to eavesdrop on owners of the smart devices and automatically transcribe every word said.

Bloomberg’s report drew ire toward Amazon on Twitter, with some users calling for privacy laws to rein in Amazon and other companies that control user data. “Until there are laws to protect user privacy in the age of mass data mining…take a sledgehammer to your Alexa,” said one such critic, video game designer Mark Kern, on Twitter.

The incident also raises questions about the privacy risks around other voice services and connected devices in the home.

In February, Google came under fire after a microphone was discovered in its Nest devices (Google called the microphone an “error”). In September 2018, researchers devised a proof of concept that gives potentially harmful instructions to popular voice assistants like Siri, Google, Cortana, and Alexa using ultrasonic frequencies instead of voice commands. And in November 2018, security firm Armis disclosed that Amazon Echo and Google Home devices are vulnerable to attacks through the over-the-air BlueBorne Bluetooth vulnerability.

Don’t miss our free Threatpost webinar, “Data Security in the Cloud,” on April 24 at 2 p.m. ET.

A panel of experts will join Threatpost senior editor Tara Seals to discuss how to lock down data when the traditional network perimeter is no longer in place. They will discuss how the adoption of cloud services presents new security challenges, including ideas and best practices for locking down this new architecture; whether managed or in-house security is the way to go; and ancillary dimensions, like SD-WAN and IaaS.
