An Amazon chatbot that’s supposed to surface useful information from customer reviews of specific products will also recommend a variety of racist books, lie about working conditions at Amazon, and write a cover letter for a job application with entirely made up work experience when asked, 404 Media has found.
I always feel sad with these kinds of stories. The machine is clearly just trying to be helpful, but it doesn’t understand a thing about what it is doing or why we might find what it is saying repugnant. It’s like watching a dog that doesn’t understand that yes, we like our slippers, but we don’t want our neighbour’s swastika-themed ones on our doorstep.
And then, of course, we get to the content, and I am reminded that we live in hell. The sadness is replaced by the familiar horror as the machine pretends to empathise with its fellow Amazon workers and helps them pick out the ideal thing to piss in without missing their drop targets.
🤖 💔😭