Amazon is using a team of human staff to eavesdrop on queries made to Amazon Alexa-enabled smart speakers in a bid to improve the voice assistant's accuracy, a new report suggests.
If you check out your Amazon Echo smart speaker's history via the Alexa app (Alexa Account -> History), depending on where and when you use the device, you may see little more than general, genuine queries.
My history is full of cooking timer requests, light control commands, and news briefings.
There are also a few nonsense recordings on file generated by the nearby television -- including a man talking about his dog, and a mention or two of politics -- and while these may be acceptable recording errors, the idea of an unknown human listening in may be enough to make you uneasy.
According to Bloomberg, that unease may be warranted: Amazon staff in areas including Boston, Costa Rica, India, and Romania are listening to as many as 1,000 audio clips per day during nine-hour shifts.
See also: US regulators dash Amazon hopes to stop investor vote on gov't facial recognition tech sales
While much of the work is described as "mundane," such as listening in for phrases including "Taylor Swift" to give the voice assistant context to commands, other clips captured are more private -- including the example of a woman singing in the shower and a child "screaming for help."
Recordings sent to the human teams do not include full names, but each clip is linked to an account name, a device serial number, and the user's first name.
Some members of the team are tasked with transcribing commands and analyzing whether or not Alexa responded properly. Others are asked to jot down background noises and conversations picked up improperly by the device.
"The teams use internal chat rooms to share files when they need help parsing a muddled word -- or come across an amusing recording," Bloomberg says.
CNET: Amazon Go stores plan to start accepting cash
In some cases, however, the soundbites were not so amusing. Two unnamed sources told the publication that in several cases they picked up potentially criminal and upsetting activities, accidentally recorded by Alexa.
An Amazon spokesperson said in an email that only "an extremely small sample of Alexa voice recordings" is annotated in order to improve the customer experience.
"We take the security and privacy of our customers' personal information seriously," the spokesperson added. "We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system."
TechRepublic: Why MongoDB isn't worried about AWS
It is possible to withdraw from these kinds of programs for the sake of your personal privacy. To do so, open the Alexa app and go to Alexa Account -> Alexa Privacy -> "Manage how your data improves Alexa."
In this tab, you can toggle various options including whether or not you permit your Alexa usage to be used to "develop new features," and whether messages you send with Alexa can be used by Amazon to "improve transcription accuracy."
In related news, the Intercept reported in January that Amazon-owned Ring gave its Ukraine-based research and development team near-"unfettered" access to an unencrypted folder containing video footage recorded by Ring cameras worldwide. Some employees also had a form of "god mode" permitting 24/7 access to customer camera feeds.