
Friday, July 12, 2019

Google employees reportedly listening in on all user conversations via Google Assistant, company responds

Google is reportedly listening in on all your conversations, including ones that were never meant to be recorded. This happens via the company's AI-powered Google Assistant on phones and Google Home speakers, Belgian news portal VRT NWS claims. The news platform acknowledges that Google, in its terms and conditions, states that everything users say to their Google smart speakers and Google Assistant is recorded and stored. However, the company doesn't mention that its employees can listen to excerpts from these recordings.

“It is true that Google does not eavesdrop directly, but VRT NWS discovered that it is listening in. Or rather: that it lets people listen in,” the news portal states. The report adds that when people listened to their own recorded audio clips, they were surprised to find that their conversations had been recorded and accessed by employees. In those recordings, “we could clearly hear addresses and other sensitive information,” which made it easier for the Belgian news portal to track down affected users.

Meanwhile, Google has acknowledged that it listens to these conversations and has provided an in-depth explanation of what it does. “As part of our work to develop speech technology for more languages, we partner with language experts around the world who understand the nuances and accents of a specific language. These language experts review and transcribe a small set of queries to help us better understand those languages. This is a critical part of the process of building speech technology, and is necessary to creating products like the Google Assistant,” Google explains.

The Mountain View-based company says that Google Assistant only sends audio to its servers after a device detects the user saying 'Hey Google' or physically triggering the assistant. When the assistant is summoned, the device is said to provide a clear indicator (flashing dots on top of a Google Home, or an on-screen indicator on an Android device) that a conversation has started.
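
To make that flow concrete, below is a minimal, self-contained Python sketch of hotword-gated recording. It is purely illustrative: the text-based “audio”, the function names and the one-frame-per-query behaviour are assumptions, not Google's implementation.

    # Toy simulation of hotword-gated recording; all names are illustrative.
    HOTWORDS = ("hey google", "ok google")

    def detect_hotword(frame: str) -> bool:
        # Stand-in for on-device hotword detection (here: a simple text match).
        return any(hotword in frame.lower() for hotword in HOTWORDS)

    def assistant_loop(frames):
        # Only audio that follows a detected hotword ever leaves the "device".
        uploaded, listening = [], False
        for frame in frames:
            if not listening and detect_hotword(frame):
                listening = True        # the real device shows an indicator here
            elif listening:
                uploaded.append(frame)  # this part is what reaches the servers
                listening = False       # one frame per query in this toy model
        return uploaded

    # Background chatter never makes it out; only the query after the hotword does.
    print(assistant_loop(["private chat", "hey google", "what's the weather?"]))
    # -> ["what's the weather?"]

In this model, a “false accept” (discussed below) is simply detect_hotword returning True on background noise, which then causes the following audio to be uploaded unintentionally.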

However, VRT NWS reviewed the recordings and said that, out of “more than a thousand excerpts,” 153 conversations should never have been recorded because the user didn't say the ‘Ok Google’ command. For those unaware, users with Google Assistant on their phones and smart speakers have to say “Ok Google” or “Hey Google” to start a conversation with the AI-powered virtual assistant.

Since users didn't summon the virtual assistant, multiple sensitive conversations were reportedly recorded unintentionally. As per the report, these include bedroom conversations, chats between parents and their children, and phone calls containing a slew of sensitive private information. In addition, one recording is said to capture a woman who was apparently in distress. This fiasco raises questions about how much Google respects its users' privacy.

Google clarifies that, “rarely,” devices with Google Assistant built in may experience a “false accept.” This means that some noise or words in the background were interpreted as the hotword that invokes the assistant. Google says it also has “a number of protections” in place to prevent false accepts.

“We apply a wide range of safeguards to protect user privacy throughout the entire review process. Language experts only review around 0.2 percent of all audio snippets. Audio snippets are not associated with user accounts as part of the review process, and reviewers are directed not to transcribe background conversations or other noises, and only to transcribe snippets that are directed to Google,” David Monsees, Product Manager, Search, said in a statement.

Not all is as bad as it looks, though. Google reportedly tries to ensure that voice excerpts are not linked to a user account, making it difficult to trace someone's identity. The company is said to delete the user name and replace it with an anonymous serial number. Even so, “it doesn’t take a rocket scientist to recover someone’s identity; you simply have to listen carefully to what is being said. If they don’t know how it is written, these employees have to look up every word, address, personal name or company name on Google or on Facebook. In that way, they often soon discover the identity of the person speaking,” the report by the news portal states.
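
The report's point about weak anonymisation can be illustrated with a short, hypothetical Python sketch. The scheme below (a random serial number per account, kept apart from the snippet) is an assumption for illustration, not Google's actual pipeline:

    import uuid

    _pseudonyms = {}  # account -> anonymous serial, kept away from reviewers

    def pseudonymise(account_id, transcript):
        # The review record carries no direct account identity...
        serial = _pseudonyms.setdefault(account_id, uuid.uuid4().hex)
        return {"reviewer_ref": serial, "transcript": transcript}

    record = pseudonymise("user@example.com", "navigate to 221B Baker Street")
    print(record)  # no e-mail address, only an opaque serial number
    # ...but the content ("221B Baker Street") can still give the speaker away,
    # which is exactly the gap the VRT NWS reviewers exploited.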

As far as the conversations accessed by VRT NWS are concerned, Monsees says that one of the language reviewers violated Google's data security policies by leaking confidential Dutch audio data. “Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again,” the executive added. Google also reiterated that users can stop audio data from being stored to their Google accounts entirely, or choose to auto-delete it every 3 or 18 months.
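
The auto-delete option Google points to amounts to a simple retention window. A rough sketch of that behaviour, assuming 30-day months and invented names rather than Google's real backend, looks like this:

    from datetime import datetime, timedelta

    def purge_old_audio(recordings, retention_months, now=None):
        # Keep only recordings newer than the chosen window (3 or 18 months).
        now = now or datetime.utcnow()
        cutoff = now - timedelta(days=30 * retention_months)
        return [r for r in recordings if r["recorded_at"] >= cutoff]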



from Latest Technology News https://ift.tt/32rdg6q
