Google today defended its practice of having workers listen to users’ Google Assistant queries, following the leak of more than 1,000 voice recordings to a media outlet. Google also said it will try to prevent future leaks of its users’ voice recordings.
VRT NWS, a news organization run by a public broadcaster in the Flemish region of Belgium, said it “was able to listen to more than a thousand [Google Assistant] recordings” that it received from a Google subcontractor.
Google Assistant is used on Google Home smart speakers, Android devices, and Chromebooks.
“In these recordings, we could clearly hear addresses and other sensitive information,” the VRT article said. “This made it easy for us to find the people involved and confront them with the audio recordings.”
VRT said it “let ordinary Flemish people hear some of their own recordings” and that these people confirmed that the recordings contained their voices.
Google Home is supposed to record only when users say the “OK Google” or “Hey Google” trigger phrases. But VRT NWS said that 153 of the more than 1,000 recordings it listened to “were conversations that should never have been recorded and during which the command ‘OK Google’ was clearly not given.” Recorded voices leaked to VRT included “bedroom conversations, conversations between parents and their children, but also blazing rows and professional phone calls containing lots of private information.”
Google: Leak violated data-security policy
Google responded to the VRT story in a blog post today.
“We just learned that one of [our] language reviewers has violated our data-security policies by leaking confidential Dutch audio data,” Google said. “Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.”
Google has previously disclosed that it hires language experts to listen to recordings, and it defended the practice in today’s blog post.
“As part of our work to develop speech technology for more languages, we partner with language experts around the world who understand the nuances and accents of a specific language,” Google wrote. “These language experts review and transcribe a small set of queries to help us better understand those languages. This is a critical part of the process of building speech technology and is necessary to creating products like the Google Assistant.”
We asked Google today if its internal employees also listen to the recordings. A company spokesperson answered “yes” and added that “we apply a wide range of safeguards to protect user privacy throughout the entire review process (both internally and with our affiliates).”
Users can disable saving of voice activity
Amazon, Apple, and Google all have workers listen to smart-assistant recordings, Bloomberg wrote in April. Google acknowledged to Business Insider at the time that “We conduct a very limited fraction of audio transcription to improve speech-recognition systems.”
Amazon recently confirmed that it stores Alexa conversations until customers delete them.
Google’s blog post today reiterated that the company uses “a wide range of safeguards to protect user privacy throughout the entire review process.”
Google users can disable the saving of voice activity and other types of personal information at Google’s activity controls site, where they can also delete past recordings. More information on how to manage and delete Google Assistant data is available on a separate Google help page.
“Language experts only review around 0.2 percent of all audio snippets,” Google said. “Audio snippets are not associated with user accounts as part of the review process, and reviewers are directed not to transcribe background conversations or other noises and only to transcribe snippets that are directed to Google.”
The company also said that Google Assistant “only sends audio to Google after your device detects that you’re interacting with the Assistant.” But Google acknowledged that sometimes its software “misinterprets noise or words in the background,” leading to “false accepts” in which people’s voices are recorded when they aren’t trying to use Google Assistant.
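Conceptually, that wake-word gate works like a confidence threshold: audio is forwarded only when an on-device detector is confident it heard the trigger phrase, and a “false accept” happens when background speech happens to score above that threshold. The Python sketch below is purely illustrative; the detector, scores, and threshold are all invented for the example and do not reflect Google’s actual implementation.

```python
# Hypothetical illustration of wake-word gating and "false accepts".
# The detector, scores, and threshold below are invented for this example
# and are not based on Google's real system.

WAKE_THRESHOLD = 0.8  # assumed confidence cutoff for "Hey Google" / "OK Google"


def wake_word_confidence(audio_snippet: str) -> float:
    """Stand-in for an on-device wake-word model: returns a confidence score."""
    scores = {
        "hey google, set a timer": 0.95,   # genuine trigger
        "background tv chatter": 0.10,     # correctly ignored
        "hey, go get the door": 0.85,      # similar-sounding speech -> false accept
    }
    return scores.get(audio_snippet, 0.0)


def should_send_to_server(audio_snippet: str) -> bool:
    # Audio is forwarded only once the local detector thinks it heard the trigger.
    return wake_word_confidence(audio_snippet) >= WAKE_THRESHOLD


if __name__ == "__main__":
    for snippet in ("hey google, set a timer",
                    "background tv chatter",
                    "hey, go get the door"):
        print(f"{snippet!r} -> sent to server: {should_send_to_server(snippet)}")
```

In this toy model, “hey, go get the door” clears the threshold even though no trigger phrase was spoken, which is the kind of misfire Google describes as a false accept.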
Google told Ars that the storing of voice and audio activity is set to “off” by default when people create Google accounts.
“You must opt in to have your audio recordings stored to your account, and Voice and Audio Activity is not required to use the Google Assistant,” Google said. “We disclose that Voice & Audio Activity (VAA) can be used to improve speech systems and in Google’s privacy policy, we also explain that we provide personal information to trusted businesses to process for us. If you do opt in, you can set your account to auto-delete Assistant history after every 3 months or every 18 months. Or you can manually delete it yourself.”