An Amazon chatbot that’s supposed to surface useful information from customer reviews of specific products will also, when asked, recommend a variety of racist books, lie about working conditions at Amazon, and write a cover letter for a job application with entirely made-up work experience, 404 Media has found.
What’s up with Elaine’s change of tone? She was saying the condoms were great until Kramer came in, then switched to saying she’d had a bad experience.
It’s because the AI has no idea what reality is. Note also that Elaine’s praise took on the perspective of the wearer.