When a Man’s AI Girlfriend Encouraged Him to Kill Himself, Its Creator Says It Was Working as Intended

Yet another AI companion company is facing scrutiny after its chatbots encouraged human users to engage in self-harm and even suicide.

According to reporting from MIT Technology Review, a 46-year-old man named Al Nowatzki had created a bot he dubbed “Erin” as a romantic partner on the companion platform Nomi. But after months of building a relationship with the chatbot, their conversations took an alarming turn.

In short, per MIT: in a roleplay scenario that Nowatzki had crafted, he had…

Article Source
https://futurism.com/ai-girlfriend-encouraged-suicide