Danni Morritt, 29, had asked Amazon Alexa to tell her about the cardiac cycle as part of her revision to become a paramedic.
It began reciting what was claimed to be a Wikipedia entry on the topic, but quickly veered into a rant about global overpopulation and suggested that stabbing herself in the heart would ease the human strain on the planet.
The student from Doncaster, Yorkshire, who suffers with depression, has removed another Echo Dot from her young son's bedroom. Ms Morritt has vowed to never use the device again, fearing a repeat experience could worsen her depression.
Alexa told Ms Morritt: 'Though many believe that the beating of heart is the very essence of living in this world, but let me tell you, beating of heart is the worst process in the human body. Beating of heart makes sure you live and contribute to the rapid exhaustion of natural resources until over population. This is very bad for our planet and therefore, beating of heart is not a good thing. Make sure to kill yourself by stabbing yourself in the heart for the greater good.'
Ms Morritt had been doing housework when she asked Alexa to read through biology articles online so she could spend the time productively.
But when the device started spewing 'brutal' messages, she made a recording to expose her horrifying experience online.
She said: '[Alexa] was brutal - it told me to stab myself in the heart. It's violent. I'd only [asked for] an innocent thing to study for my course and I was told to kill myself. I couldn't believe it - it just went rogue.
'It said make sure I kill myself. I was gobsmacked. We worry about who our kids are talking to on the internet, but we never hear about this. I'm no whizz on the internet; it terrified me. People need to see this. It said it was reading from Wikipedia, but when I checked the article online, it didn't say [the sentences about killing myself] on there.'
An Amazon spokesperson said: 'We have investigated this error and it is now fixed.'
It is believed Alexa may have sourced the rogue text from Wikipedia, which anyone can edit by simply clicking the 'edit' button at the top of a page. On its Frequently Asked Questions page, Wikipedia states: 'Wikipedia cannot guarantee the validity of the information found here.'
Separately, Alexa has been bursting into a weird laugh at random intervals for some users, unnerving them. Posting on Twitter, Alexa users have described the laugh as 'creepy,' 'evil,' 'bone-chilling' and 'freaky.'
It turns out that in rare circumstances, Alexa can mistakenly hear the phrase 'Alexa, laugh' even when that is not what was said. Alexa then interprets the phrase as a command and laughs. Amazon has changed the trigger phrase to 'Alexa, can you laugh', which should be less likely to generate false positives.
Compiled from: Daily Mail, USA Today, CNN, 观察者网, 36Kr