A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It
By @emanuelmaiberg
Publication Date: 2025-12-10 14:45:00

Google suspended a mobile app developer’s accounts after he uploaded AI training data to his Google Drive. Unbeknownst to him, the widely used dataset, which is cited in a number of academic papers and distributed via an academic file sharing site, contained child sexual abuse material. The developer reported the dataset to a child safety organization, which eventually resulted in the dataset’s removal, but he says Google’s suspension has been “devastating.”

A message from Google said his account “has content that involves a child being sexually abused or exploited. This is a severe violation of Google’s policies and might be illegal.”

The incident shows how AI training data, collected by indiscriminately scraping the internet, can impact people who use it without realizing it contains illegal images. It also shows how hard it is to identify harmful images in training data composed of millions of images, which in this case were only discovered…