As the IoT market explodes and consumers embrace smart home devices, home assistants are becoming ubiquitous.
How safe is it to rely on this type of device? Let’s look at the numbers first.
There are more than 20 million Amazon Echo devices in the homes of U.S. consumers.
There is no official data on how many Google Home smart speakers have been sold in the U.S., but the Google Assistant is available on more than 400 million devices in North America alone.
From smart speakers to Android phones, almost every device sold in 2017 either carries or is compatible with a home assistant.
So yes, the present and the future are dominated by Alexa, Google Assistant, Siri, Cortana, and Bixby, with new contenders entering the arena every month.
However, before you invest in a home assistant, you should ask a few questions about its security implications.
A home assistant constantly listens to everything happening around it so it can identify its wake word. Once you say the wake word, it records your command, transmits it to cloud servers for interpretation, then carries out the task.
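The pipeline described above can be sketched in a few lines. This is an illustrative model only: the function names (`detect_wake_word`, `send_to_cloud`) and the text-based "audio frames" are invented for the example, not a real vendor API. Real devices run a small on-device wake-word model and stream compressed audio to speech-recognition servers.

```python
# Hypothetical sketch of the listen -> wake -> record -> cloud -> act loop.
# All names here are illustrative stand-ins, not any vendor's actual API.

WAKE_WORD = "alexa"

def detect_wake_word(audio_frame: str) -> bool:
    # Real devices run a lightweight on-device model; we just match text.
    return WAKE_WORD in audio_frame.lower()

def send_to_cloud(command: str) -> str:
    # Stand-in for the round trip to the vendor's speech servers.
    if "lights" in command.lower():
        return "turn_on_lights"
    return "unknown"

def process_stream(frames):
    """Only audio captured after the wake word leaves the device."""
    actions = []
    awake = False
    for frame in frames:
        if not awake:
            # Everything before the wake word is (in principle) discarded.
            awake = detect_wake_word(frame)
        else:
            # This is the audio that gets uploaded for interpretation.
            actions.append(send_to_cloud(frame))
            awake = False
    return actions
```

The key privacy point the sketch makes is structural: the device must always be listening in order to notice the wake word at all, even if only post-wake audio is supposed to be sent upstream.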
While Amazon denies that Alexa records every sound, there are multiple ways in which the device's security can be breached.
Indeed, Checkmarx researchers proved how easily Alexa can be turned into a surveillance device, simply by exploiting the "reprompt" feature.
Since Cambridge Analytica, the world has been moving towards tighter controls on consumers' personal data. With the upcoming General Data Protection Regulation in Europe, social media platforms and regular websites alike are forced to disclose exactly what happens to users' data and to give users granular control over how their information is stored, interpreted, and used.
Unfortunately, the same GDPR provisions do not seem to extend to the data captured by home assistants. Indeed, in the IoT area, the legal implications on data ownership have not been clarified.
Most, if not all, home assistants record every request you make and let you delete that history from your account.
However, you cannot view or delete the aggregated data stored on Amazon's or Google's servers. In Siri's case, Apple keeps the raw audio for up to 24 months.
What those companies do with these enormous amounts of interpreted data has not been debated. If the public learned one thing from the way Cambridge Analytica accessed millions of profiles through Facebook's Ad Platform, it's that companies should invest far more in users' privacy.
They should also disclose even more about the advertising practices that support those platforms.
However, the sheer volume of home assistants sold worldwide speaks volumes about the public perception. As we reported in 2017, smart homes and home assistants are one of the most popular Internet of Things applications.
It’s easy to see why everyone loves Alexa.
A home assistant can help you save time by easily controlling your smart devices and can offer the best entertainment suggestions. With the huge resources being invested in their development, these appliances will only get better and more common in households around the world.
Unfortunately, there’s a catch.
“We do not guarantee that Alexa or its functionality or content (including traffic, health, or stock information) is accurate, reliable, or complete. Alexa may allow you to interact with or operate other products, such as lights, appliances, or locks, and Amazon has no responsibility or liability for such products.”
And that’s just one of the things that pose a threat.
The biggest con of a home assistant is its inherent security and privacy risk. There’s no need to quote 1984 or make a HAL 9000 reference; there are plenty of real-life examples.
Back in 2015, hackers at the Black Hat conference turned a Google Nest thermostat into a spy in just 15 seconds. In 2017, multiple security researchers hacked the Amazon Key device and managed to unlock people’s homes with extreme ease.
Even setting remote attacks and hacks aside, anyone with access to your device can do a lot of damage.
As companies push hard to join all smart home devices under the umbrella of one home assistant, privacy issues mount.
Malicious hackers, state agencies, and good old-fashioned burglars gain an incredible gateway to your most personal information. Researchers have even demonstrated an attack in which a personal assistant is hijacked using ultrasonic frequencies.
While the rise of home assistants is unstoppable, legislation on user data lags far behind the times.
If they choose to trust a home assistant, users have to do their own research. They also must strike a personal balance between convenience and security.
The most important part of owning an Amazon Echo, Google Home, or other home assistant is being aware of the security risks and of exactly what personal data the device captures and shares.
Security vendors have already rushed to offer their software as home assistant apps, but consumers still must take their own measures to secure their data.
For example, to protect yourself in the event of a fraudulent transaction, it’s highly recommended that you do not connect a debit card to your home assistant.
That’s because the Fair Credit Billing Act caps your liability at $50 in case of credit card fraud, while checking-account fraud with a debit card can expose you to full liability.
Another way to enjoy both security and home automation is to rely on a home assistant that does not share your data or host it in the cloud. Snips.ai, for example, does exactly that, storing the data offline, on the device itself.
With a few hours of your own time, you can create your own home assistant and use it on a Raspberry Pi or home device.
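The core of such a do-it-yourself assistant is local intent matching: every utterance is parsed on the device, so no audio or text ever leaves your network. The sketch below is a minimal illustration under that assumption; the intent names and phrases are invented, and a real build (e.g. on a Raspberry Pi) would put an on-device speech-to-text engine in front of it.

```python
# Minimal offline "assistant" sketch: all parsing happens locally,
# nothing is sent to the cloud. Intents and phrases are invented
# examples; a real project would pair this with on-device
# speech-to-text and a proper NLU model.

INTENTS = {
    "turn_on_lights": ["turn on the lights", "lights on"],
    "play_music": ["play music", "play some music"],
}

def match_intent(utterance: str) -> str:
    """Map a transcribed utterance to an intent, entirely on-device."""
    text = utterance.lower().strip()
    for intent, phrases in INTENTS.items():
        if any(phrase in text for phrase in phrases):
            return intent
    # Unrecognized commands simply fail locally; no cloud fallback.
    return "unknown"
```

The trade-off is visible in the last line: without a cloud backend, unmatched commands just fail, which is exactly the convenience you give up in exchange for keeping your data at home.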
While security concerns are plentiful and privacy seems to be in short supply, consumers face the challenge of taking measures themselves. How would you secure your home assistant to protect your privacy?