Facebook has revealed that it paid contractors to listen to clips of audio harvested from Messenger, and says it has "paused" the program to stem public backlash.
The social networking giant used hundreds of outsourced staff to listen to and transcribe the clips. According to Bloomberg, people with knowledge of the matter said the contracted employees were not told why Facebook needed such a project or why the transcriptions were important.
Facebook said the contractors were tasked with checking to see if the firm's artificial intelligence-based algorithms and systems were correctly interpreting audio content.
The company added that only users who chose to permit voice chats to be transcribed in Facebook's Messenger app were affected. Furthermore, according to Facebook, the clips were anonymized.
However, Facebook says "we paused human review of audio more than a week ago," likely due to the public backlash that the exposure of similar programs has caused.
At least one company, TaskUs, was involved in transcribing user conversations. The company worked with Facebook as a client under the codename "Prism" and would also review content believed to be in potential violation of Facebook policies.
TaskUs said that Facebook had asked the firm to stop its transcription activities over a week ago, according to the publication.
Back in July, Amazon became the first technology firm to come under fire for storing users' conversations with the Alexa voice assistant -- found in products including the Echo and Echo Dot smart speakers -- unless a customer chose to manually delete the voice recordings.
Amazon now plans to allow users to opt out of the snooping.
In the same month, reports surfaced revealing that Apple's Siri voice assistant was regularly recording private conversations, some of which ended up in the hands of contractors tasked with grading the technology's interpretation of audio.
The individuals involved were asked to grade Siri's responses to commands and whether or not the system woke itself in response to the right wake word -- but in some cases, noises, including the sound of a zipper, would be misconstrued as "Hey Siri."
Clips relating to sexual activity, drug deals, and private conversations between patients and doctors were all stored. However, user IDs were not connected to the recordings.
By August, Apple decided to suspend the program. The iPad and iPhone maker said:
"While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."
Google, too, ran a similar program in which clips harvested from Google Home smart speakers and the Android Google Assistant were systematically listened to. In response to questions from data protection regulators in the EU, Google suspended the project for three months to allow a review to take place. However, the temporary pause affects only users in European countries.
Facebook has just finalized a settlement with the US Federal Trade Commission (FTC) over past privacy and security failures impacting users and so needs to tread carefully when it comes to further potential data-harvesting scandals.
In related news this week, the company has reportedly found itself in a tug-of-war between the FTC and the FBI -- the latter of which is seeking third-party assistance in conducting mass harvesting of user data on the platform.
Previous and related coverage
- Apple, Google: We've stopped listening to your private Siri, Assistant chat, for now
- Amazon confirms Alexa customer voice recordings are kept forever
- Apple Q2 beats estimates on record high services revenue
Have a tip? Get in touch securely via WhatsApp | Signal at +447713 025 499, or over at Keybase: charlie0