
Skillful, but not necessarily reliable?
Amazon
Just the other day I wondered if it would be fun to have a cuckoo clock in my kitchen.
A cuckoo clock from Amazon Alexa, I mean.
I concluded that the idea was crazy, like most things enabled for Alexa.
But we all have our prejudices and many Americans are very happy to have Amazon’s Echos and Dots scattered around their homes to make their lives easier.
Now, Alexa can even buy your mummy for you, if you want.
However, perhaps Alexa lovers should be warned that things may not be as pleasant as they seem.
Skills? Oh, everyone has skills.
A new survey of concerned academics from Germany’s Ruhr-University Bochum, along with equally concerned colleagues from the state of North Carolina – and even a researcher who joined Google during the project – can make Alexa owners wonder about the true meaning of an easy life.
The researchers analyzed 90,194 Alexa skills. What they found was a security Emmenthal that would make a rat wonder if there was any cheese there.
How much would you like to shudder, oh happy owner of Alexa?
How about this quote from Dr. Martin Degeling: “A first problem is that Amazon has partially activated the skills automatically since 2017. Previously, users had to agree to the use of each skill. Now they hardly have an overview of where the Alexa’s answer from them comes and who programmed it in the first place. “
So the first problem is that you have no idea where your smart response comes from whenever you wake Alexa up from your sleep. Or, in fact, how safe your question may have been.
Ready for another quote from the researchers? Here it is: “When a skill is published to the skill store, it also displays the name of the developer. We found that developers can register with any company name when creating their developer account on Amazon. This makes it easier for an attacker impersonate any known manufacturer or service provider. “
Please, this is the kind of thing that makes us laugh when big companies are hacked – and don’t tell us for months, or even years.
In fact, these researchers tested the process for themselves. “In an experiment, we were able to publish skills on behalf of a large company. Valuable user information can be obtained here, ”they said modestly.
This discovery was exciting, too. Yes, Amazon has a certification process for these skills. But “no restrictions are imposed on changing the backend code, which can be changed at any time after the certification process.”
In essence, then, a malicious developer can change the code and start collecting sensitive personal data.
Safety? Yes, it is a priority.
So, say the researchers, there are skill developers who publish under a false identity.
Perhaps, however, all of this sounds very dramatic. Certainly, all of these skills have privacy policies that govern what they can and cannot do.
Please sit down. From the survey: “Only 24.2% of the competencies have a privacy policy”. So three-quarters of the skills, well, don’t.
Don’t worry, there is worse: “For certain categories like ‘kids’ and’ health and fitness’, only 13.6% and 42.2% of the skills have a privacy policy, respectively. As privacy advocates, we consider ‘ children ‘and’ health-related skills must meet higher standards with regard to data privacy. “
Naturally, I asked Amazon what it thought of these rather cold discoveries.
An Amazon spokesman told me, “The security of our devices and services is a priority. We conduct security reviews as part of the skills certification and we have systems in place to continuously monitor live skills for potentially malicious behavior. All skills offenses that we identify are blocked during certification or quickly deactivated. We are constantly improving these mechanisms to further protect our customers. “
It is encouraging to know that security is one High priority. I think getting customers to have fun with Alexa’s maximum skills so that Amazon can collect as much data as possible can be a higher priority.
Still, the spokesman added: “We appreciate the work of independent researchers who help bring potential problems to our attention.”
Some might translate it as, “Damn, they are right. But how do you expect us to monitor all these little skills? We are too busy thinking big.”
Hey, Alexa. Does anyone really care?
Of course, Amazon believes that its monitoring systems work well in identifying real scoundrels. Somehow, waiting for developers to follow the rules is not the same as making sure they do.
I also understand that the company believes that children’s skills are often not attached to a privacy policy because they do not collect personal information.
To which one or two parents can murmur, “Uh-huh?”
Ultimately, like so many tech companies, Amazon prefers you to monitor – and change – your own permissions, as that would be very economical for Amazon. But who really has these monitoring skills?
This survey, presented last Thursday at the Network and Distributed System Security Symposium, is such a brutal read that at least one or two Alexa users can consider what they are doing. And with whom.
So again, do most really care? Until an unpleasant event happens, most users just want to have an easy life, having fun talking to a machine when they could easily turn off the lights themselves.
After all, this is not even the first time that researchers have exposed the vulnerabilities of Alexa’s skills. Last year, academics tried to upload 234 Alexa policy violation skills. Tell me how many were approved, Alexa? Yes, all of them.
The most recent skill researchers themselves contacted Amazon to offer some kind of “Hey, look at this”.
They say, “Amazon has confirmed some of the problems for the research team and says it is working on countermeasures.”
I wonder what skills Amazon is using to achieve this.