An Amazon chatbot that’s supposed to surface useful information from customer reviews of specific products will also recommend a variety of racist books, lie about working conditions at Amazon, and, when asked, write a cover letter for a job application with entirely made-up work experience, 404 Media has found.