Gartner Blog Network


Alexa: The First AI Fake News Victim?

by David Yockelson  |  January 13, 2017

The media were all agog last week when a toddler asked Alexa – the voice service built into her Amazon Echo, for those asleep at the switch this past Christmas – to “get me a dollhouse.” Alexa allegedly complied and somehow ordered a significant quantity of cookies as well (presumably, the child asked for those, too). To add insult to (virtual) injury, when this story was broadcast, any Echo devices within earshot of TV or radio news discussing Alexa’s naughtiness also heard the order and proceeded to attempt the same. Hilarious! AI and machine learning hijinks! This must be the kind of joking around that happened about a year before Skynet took over, right?

Being a cynical analyst, I decided to see if this was actually as easy as it was made out to be. First, I looked for the technical description that I KNEW had to be available from some techie pub documenting exactly what was asked of Alexa, how she responded, how the orders were taken, etc. It didn’t exist. Instead I found roughly 623 copies of the same story, all created and published from the original report “on the wire.” It seemed the world simply believed that anyone could order anything through Alexa and, with no safeguard, have it shipped (quickly) to their door. Undaunted, I took the next best step: I recreated the event.

I said to Alexa, “get me a dollhouse.” She asked, “Do you want me to order dollhouse?” I said, “No.” She said, “All right.” Go ahead, try this yourself (you may have already done so). Now, a toddler with a willing Alexa isn’t about to rescind any orders for dollhouses or cookies, but adults whose Echoes (that is the correct plural, I checked, though not with Amazon marketing) heard newscasts and answered similarly just needed to say “No.” Further, orders taken by Alexa are recorded and transmogrified into electronic orders that can also be changed or canceled. HOWEVER, I decided to take things a step further and ordered something myself via Alexa. I received no e-mail or notification that the order had been placed (well, until a little while later, when a confirmation was sent), so the girl’s parents may not have known she placed an order either. But all in all, this isn’t the evil dollhouse-inator we’d been led to believe now existed.
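The dialog above amounts to a simple confirm-before-purchase flow: state the item, get asked for confirmation, and the order proceeds only on an explicit “yes.” Here is a minimal sketch of that flow; none of this is Amazon’s actual API, and the function and phrase lists are invented for illustration.

```python
# Hypothetical sketch of a confirm-before-ordering dialog, modeled on the
# exchange described above. Not Amazon's real implementation.

AFFIRMATIVE = {"yes", "yeah", "sure", "ok"}  # assumed set of confirming replies


def handle_purchase_request(item: str, confirmation_reply: str) -> str:
    """Ask for confirmation, then act only on an explicit affirmative reply."""
    prompt = f"Do you want me to order {item}?"
    reply = confirmation_reply.strip().lower()
    if reply in AFFIRMATIVE:
        return f"{prompt} -> order placed for {item}"
    # Any non-affirmative answer cancels, just as saying "No" did in my test.
    return f"{prompt} -> all right, not ordering {item}"
```

The point the sketch makes is the one the coverage missed: a single negative word anywhere in the confirmation step stops the order, so an adult within earshot of a newscast only had to say “No.”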

I decided to take it a step further and visited the Alexa app on my smartphone to look up (if I could) what she had heard and interpreted. As I expected, she’d recorded my two voice commands. What I didn’t expect was that they were not only available to see as text; they were played back to me – my voice – when I clicked on them. Think about it – every command or phrase spoken to Alexa is recorded and stored, though Amazon has assured us numerous times that they are not used for anything else. Of course, Alexa only “wakes up” when you say her wake word (usually Alexa, though it can be changed to two other options – which can also dissuade youthful purchasers), but is she still listening? Unlikely (well, there’s no local speech processing on an Echo or Dot, so the wake word essentially tells the device not only to listen but to send what it hears to the “cloud” for processing), but this is the “creep factor” often cited by those who are leery of Alexa and similar voice services. Personally, I believe that nothing nefarious is going on with my voice or my requests, ill-conceived or otherwise. But think of how interesting it would be to reflect upon a year’s worth of ordering, conversations, shopping lists and other information shared with Alexa.
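The wake-word behavior described above can be sketched as a gate: the device does nothing with audio locally except spot the wake word, and only the audio that follows it gets shipped off for cloud processing. This is a toy illustration under those assumptions – the names, the word-per-frame simplification, and the wake-word set are all invented, not Amazon internals.

```python
# Hypothetical sketch of wake-word gating. Each string stands in for a short
# audio frame; real devices work on raw audio, not words. Invented names.

WAKE_WORDS = {"alexa", "amazon", "echo"}  # assumed configurable wake words


def stream_after_wake(audio_frames):
    """Yield only the frames that follow a detected wake word."""
    awake = False
    for frame in audio_frames:
        if not awake:
            if frame.lower() in WAKE_WORDS:  # the only local "processing"
                awake = True
            continue  # everything before the wake word stays on the device
        yield frame  # this is what would be sent to the cloud


list(stream_after_wake(["chatter", "alexa", "get", "me", "a", "dollhouse"]))
# -> ["get", "me", "a", "dollhouse"]
```

The design point is that the gate is the privacy boundary: until the wake word trips it, nothing leaves the device – which is also why changing the wake word to something a toddler won’t say can curb accidental purchases.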

The moral of the story is…caveat emptor. Users of any voice service, be it from Amazon, Google, Microsoft, Apple or any other provider, need to think at least a little about their environments and what could take place given the (today) relatively weak front-end identification these services require. But most things one suspects or hears, as with the majority of the fake news being spread these days, aren’t quite true. Uncontrolled voice ordering and fulfillment via unwitting Alexa users is one of those things.

Category: digital-business  iot  

Tags: alexa  amazon  

David Yockelson
Research VP
1 year at Gartner
30 years IT Industry

David Yockelson is a Research Vice President on the Tech Go-to-Market and Sales Strategies team in the Technology and Service Provider Research organization. Read Full Bio





Comments or opinions expressed on this blog are those of the individual contributors only, and do not necessarily represent the views of Gartner, Inc. or its management. Readers may copy and redistribute blog postings on other blogs, or otherwise for private, non-commercial or journalistic purposes, with attribution to Gartner. This content may not be used for any other purposes in any other formats or media. The content on this blog is provided on an "as-is" basis. Gartner shall not be liable for any damages whatsoever arising out of the content or use of this blog.