Apple's Siri regularly records private and confidential conversations and activities, including discussions of medical conditions, drug deals, and sex acts.
The Guardian reports that an unnamed whistleblower has brought the situation to light, in which contractors working for the iPad and iPhone maker regularly listen in on Siri interactions as part of their job grading the voice assistant.
Staff members tasked with grading how Siri responds to commands, and whether or not the correct wake phrase "Hey Siri" was used before a recording occurred, often hear explicit recordings, which are accidentally saved when the assistant mistakes a sound for the wake phrase.
The publication's source notes, for example, that the sound of a zipper can be misinterpreted as a command to wake up. In what the whistleblower describes as "countless instances," conversations between doctors and patients, business deals, and both criminal and sexual activity have been captured by the smart assistant.
The Apple Watch, in particular, has come under fire. While many recordings captured by Siri may only be a few seconds in length, The Guardian says that the watch -- with Siri enabled -- may record up to 30 seconds.
Apple says that less than one percent of activations are sent elsewhere for grading.
"A small portion of Siri requests are analyzed to improve Siri and dictation," Apple told the Guardian. "User requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."
In contrast, the source says that recordings "are accompanied by user data showing location, contact details, and app data." However, identifiers appear to be masked, which would mean connecting a specific recording to a particular user would be a challenge.
It is understandable that any company developing a voice-based service would wish to capture some data to improve the quality of interactions and to identify and rectify mistakes in its voice recognition technology. However, as noted by Macworld, consumers would appreciate some way to opt out of their data being used in this manner.
The reports bring to mind past privacy issues with rival product Amazon Alexa. There have been cases in which Alexa has also listened in on private conversations, and earlier this year it was revealed that human operators monitor Alexa interactions for quality control purposes as well.
In July, Amazon confirmed that voice recordings are held with no expiry date unless customers manually remove them.
Previous and related coverage
- Bluetooth exploit can track and identify iOS, Microsoft mobile device users
- Malvertising campaign targets Apple users with malicious code hidden in images
- Apple: iPhone info requests from Chinese government have exploded
Have a tip? Get in touch securely via WhatsApp | Signal at +447713 025 499, or over at Keybase: charlie0