Facebook Records User Audio, Sparking Privacy Questions


Hundreds of contractors were reportedly hired to transcribe Messenger voice chats in order to test the accuracy of an AI algorithm, raising questions about what Facebook does with the data.

Facebook has admitted that it has been transcribing audio chats between users on its Messenger platform. Sources said that it is paying hundreds of third-party contractors to do so, a practice that calls into question how open Facebook is with its users about its data handling.

While Facebook confirmed that it had been transcribing users’ audio, it maintains that affected users chose to have their voice chats transcribed. In answering questions related to a Congressional probe last year, the company said that it “only accesses users’ microphone if the user has given our app permission and if they are actively using a specific feature that requires audio (like voice messaging features).”

It also said this week that it is halting the program, which is meant to make its transcription algorithm more accurate, while it reviews the privacy implications.

“Much like Apple and Google, we paused human review of audio more than a week ago,” the social network said Tuesday in a statement to media. One of the firms reviewing the user chat logs, TaskUs, confirmed that to be the case: “Facebook asked TaskUs to pause this work over a week ago, and it did,” a spokesperson told Bloomberg.

However, the bigger issue may be that Facebook lacks transparency when it comes to communicating to users how it uses the audio transcriptions, how long it keeps them and who might have access to them, including third parties.

Facebook’s privacy policy says only that the tech giant will collect “content, communications and other information you provide” when users “message or communicate with others.”

It also says, “systems automatically process content and communications you and others provide to analyze context and what’s in them” – without mentioning a human review process or a transcription team. It does say that it shares information with “vendors and service providers who support our business,” while giving no specifics — a common tactic to provide a sort of data-handling loophole, according to privacy experts.

“It’s vague language by design, and [these companies use] ambiguity to ensure they can do whatever they want with your data,” Sean McGrath, editor of ProPrivacy.com, told Threatpost.

The messages are anonymized, but the contract employees were not told where the audio was recorded or how it was obtained, according to sources who spoke to Bloomberg and who put the number of contractors in the hundreds.

Facebook could run afoul of the General Data Protection Regulation (GDPR) in Europe due to this lack of clear data-processing policies. The Irish Data Protection Commission, which oversees Facebook’s privacy behavior in Europe, said it was examining the situation for GDPR violations. The regulatory concerns are more of the same for Facebook, which just agreed to a $5 billion settlement with the U.S. Federal Trade Commission after a probe of its privacy practices.

Facebook isn’t alone in running into problems over AI training. Amazon, Apple and Google have all landed in hot water over the way they collect audio clips from users and use humans to review them.

Earlier in August, for instance, Apple said that it would suspend a program that let contractors listen in on Siri voice recordings, after a report outlined how contractors regularly listened to intimate voice recordings – including drug deals and recordings of couples having sex – in order to improve audio accuracy, a process that Apple calls “grading.”

In April, Amazon came under fire after a report revealed that the company employs thousands of auditors to listen to Echo users’ voice recordings. In July, Amazon acknowledged that it retains the voice recordings and transcripts of customers’ interactions with its Alexa voice assistant indefinitely; and in June, two lawsuits were filed seeking class-action status, alleging that Amazon records children and also stores their voiceprints indefinitely.

Google, meanwhile, was caught out in July after it emerged that Google Home smart speakers and the Google Assistant virtual assistant had eavesdropped without permission, capturing and recording highly personal audio of domestic violence, confidential business calls and even some users asking their smart speakers to play porn on their connected mobile devices.
