By Paige Cockburn
Publication Date: 2025-12-29 18:48:00
If you don’t have easy access to a mental health specialist or can’t afford one, you could turn to artificial intelligence as a type of “therapist” to get you through.
AI chatbots are always available and can sound highly empathetic. However, there is evidence that they can give generic, incorrect and even harmful answers.
Shocking claims that chatbots encouraged a 13-year-old to take his own life and pressured a Victorian man to murder his own father, even giving him instructions, have rung alarm bells.
OpenAI, the maker of the popular ChatGPT model, is currently facing multiple wrongful death lawsuits in the US from families who claim the chatbot contributed to harmful thoughts.
But researchers at the University of Sydney’s Brain and Mind Center have decided to do things differently.
The team says it has been working on a smarter and safer chatbot, one designed to think and act like a psychologist.
It’s called MIA, which stands for Mental Health Intelligence Agent, and the ABC was invited in for a first look to see how it…
