By bbc.com
Publication Date: 2025-11-12 00:10:00
Liv McMahon, Technology reporter
The UK government will allow tech companies and child safety charities to proactively test artificial intelligence tools to ensure they cannot create images of child sexual abuse.
A change to the Crime and Policing Act announced on Wednesday would allow "authorised testers" to check models for their ability to generate illegal child sexual abuse material (CSAM) before they are released.
Technology Minister Liz Kendall said the measures would “ensure AI systems can be made safe at source” – although some campaigners argue more needs to be done.
The Internet Watch Foundation (IWF) said the number of reports of AI-generated CSAM has more than doubled in the past year.
The charity, one of the few in the world licensed to actively search for child abuse content online, said it acted on 426 reports of such content between January and October 2025, up from 199 in the same period in 2024.
Its CEO Kerry…