By Emily Kowal
Publication Date: 2026-03-23 08:30:00
Children and young people who turn to AI companion chatbots for relationships risk exposure to sexually explicit content and encouragement to self-harm or suicide, according to a new report from the eSafety Commissioner.
A groundbreaking transparency report from the eSafety Commissioner found that almost 80 percent of Australian children and young people use popular AI companion bots, but that these services fail to shield children from sexually explicit content and do not do enough to prevent users from creating child sexual exploitation and abuse material.
In October, the eSafety Commissioner asked four of the most popular companion chatbot platforms – Character.AI, Chub AI, Nomi and Chai – to explain how they protect children from exposure to…
