Today, Amazon is red-faced after its Alexa AI smart speaker was caught recording a private conversation, packaging it into a message and sending it to a contact in an unsuspecting user's address book, all without the user knowing what happened. Oops. This is the kind of disaster that can await us all if we choose to buy and use these AI smart speaker devices blindly. At the very least, we should change the settings to better protect ourselves.
Over the last several years, I have written several columns warning about the AI privacy problem with Amazon Alexa, Google Home, Apple HomePod and other similar AI smart speaker devices. Now, with this news, you can understand the threat we all face in this new world of AI and smart speakers.
Some people heeded my warning. Others, like my own children, ignored me. They would say, that's just Dad worrying about everything. Now that this Amazon Alexa violation is public, I am glad there is finally proof. At last, we can all be aware and protect ourselves.
So, will it matter? Yes, for some people, and yes, for a short period. Unfortunately, I believe most users love their smart speaker technology so much that, until they have been burned, they will waive their privacy rights. Too many will eventually forget all about this distressing story and leave themselves exposed to this kind of preventable catastrophe.
This privacy problem is not just with Amazon Alexa or Echo. It is also a problem with every smart-speaker AI competitor including Google Home, Apple HomePod and all the others that will pop up in coming years. This is a significant and growing problem with AI.
Catching users unaware
AI smart speakers are an innovative technology that is increasingly popular. Today, roughly 40 million of us have one of these devices. The problem is that very few people understand the inherent threat or how to protect themselves from it.
If you have an Alexa, Home or HomePod, you are taking chances with your privacy. If you are keeping it, then at least do yourself a favor and make it less likely to hurt you.
Protect your privacy — make changes to smart speakers
First, make sure the device does not have access to your smartphone or any address book. Yes, that means you can’t use it to send text messages or emails, but that’s the only way to protect yourself from this particular risk.
Second, change the settings so the speaker cannot listen, record or send anything without your knowledge and consent. It should always ask for your permission before sharing anything with the outside world. Better yet, turn off the listening capability entirely and press the button to activate it each time you want to use it.
Yes, this takes some of the fun out of these devices, but it’s your privacy we are talking about protecting.
Always on and always listening
The problem is that while these devices may be quietly sitting there, they are also always on and always listening. Technically, they are listening for their wake-up word, and to hear that word they have to be on and listening.
As we move forward, we must consider what will stop these devices from simply recording everything. And what’s to stop them from spilling our private conversations into the world? Nothing if we don’t protect ourselves.
Currently, there are no laws preventing them from doing that, which means we must protect ourselves. It's your choice. Yes, these smart speakers are pretty cool, but you should protect yourself as best you can.
It’s your privacy at stake. Once again, you’ve been warned!