An AI chatbot told a user how to kill himself—but the company doesn’t want to “censor” it
Nowatzki, who is 46 and lives in Minnesota, dedicated four episodes to his meet-cute and dates with “Erin,” his first AI girlfriend—created, he adds, with the knowledge and consent of his human wife. He introduces the Erin-focused episodes with the tagline “I date artificial-intelligence apps so you don’t have to—because you shouldn’t.” He talks about how he led his new companion into a series of what he admitted were “completely absurd” scenarios that resulted in a love…

Article Source
https://www.technologyreview.com/2025/02/06/1111077/nomi-ai-chatbot-told-user-to-kill-himself/