
Google Is Absolutely Listening to Your Conversations, and It Confirms Why People Don't Trust Big Tech

Started by Doha, July 17, 2019, 09:45:21 AM


Doha

Google Is Absolutely Listening to Your Conversations, and It Confirms Why People Don't Trust Big Tech
In a blog post, the company revealed that audio of Google Assistant conversations is reviewed by humans.

By Jason Aten, writer and business coach

You figured they were listening anyway, right?

It turns out you were right. Every time you talk to your Google Assistant, there's a chance someone might listen to the audio from that conversation. Which is revealing for a few reasons, not the least of which is that Google obviously records, saves, and transmits your voice data in a way that can be accessed by actual people. So much for privacy.

In a blog post published yesterday by David Monsees, Google's product manager for search, the company says: "These language experts review and transcribe a small set of queries to help us better understand those languages. This is a critical part of the process of building speech technology, and is necessary to creating products like the Google Assistant."

Google says its reason for having human contractors listen to your conversations with Google Assistant is to improve performance in multiple languages. That revelation came in response to a leak of audio by a contractor Google refers to as a "language reviewer."

People are listening.
Every time you say "Hey Google," or physically activate the Google Assistant feature on your smartphone or Google Home, your interactions are recorded, and those recordings may then be reviewed by the contractors Google says it uses to improve its products.

However, in addition to listening when you give a command, sometimes your device will experience what Google calls a "false accept," which means that your conversation is recorded even though you're not directly engaging with Google Assistant, and haven't given the wake command.

That means it's possible for Google's contractors to listen to audio recorded when you're talking to your spouse or on the phone, even when you're not interacting with a Google device.

As for the personal information that is captured, Google says that just 0.2 percent of all audio snippets end up being listened to by the company's language reviewers. The company also lets you delete those snippets manually, or have them deleted automatically after a set period of time.

Still, this news reveals a significant gap between how people assumed Google's voice assistant works and how the company actually operates it, and it's indicative of why people have such a hard time trusting the company. Even if the reason for listening is completely benign, the constant stream of news about data breaches, privacy concerns, and even regulatory investigations makes it harder and harder to give the company the benefit of the doubt.

Big tech companies are getting harder to trust.
I think most people assume that Google's computers are listening, monitoring, recording, and analyzing pretty much every interaction with the company's products like search or Photos. But I think that most of us never give much thought to the fact that it's possible that actual people might be on the listening end.

And the fact that your voice data is transmitted to contractors for any reason means that there's always a chance that it could be leaked or put at risk. In fact, that's exactly what happened here. A Dutch contractor leaked sensitive voice recordings.

With Apple's Siri assistant, for example, the processing of most voice commands happens on the device, and the only information sent to the cloud is a request for the specific information, like a sports score or directions.


Apple also doesn't record your voice waiting for you to say "Hey Siri," and if it does capture voice audio, the actual recording of your voice never leaves the device.

As I've written here before, your personal information and privacy are increasingly at risk. Tech companies don't have the best track record at honoring boundaries with your personal information or privacy and haven't exactly done a great job of protecting that information either.

Trust is your most valuable asset.
Trust is quickly becoming a company's most valuable brand asset, especially if you are a tech company. Even if you aren't, there's an opportunity to distinguish your brand by the way you treat your customers and their information.

In fact, there's an opportunity to recognize that your users aren't your product, and even if your business model is based on selling ads, it's possible to do it in a way that balances your need for information with the privacy of your users.

Instead, be transparent about what you plan to do with their information. By the way, transparent doesn't mean burying it deep in some terms and conditions or privacy policy. It means being upfront about exactly what the cost is in terms of a user's personal information, and what exactly you plan to do with that information.

At the same time, as a user, it's your responsibility to understand exactly what happens with your information, despite the fact that tech companies have very little incentive to be transparent.

There will always be a trade-off anytime you use technology--especially when it involves listening to your voice, understanding what you're saying, and providing your information--but be educated about that trade-off so you make an informed decision and count the cost.