By bbc.com
Publication Date: 2026-01-14 23:51:00
Elon Musk’s AI model Grok will no longer be able to edit photos of real people to show them in revealing clothing after widespread concerns about sexualized AI deepfakes in countries including the UK and US.
“We have taken technical measures to prevent the Grok account from being able to edit images of real people in revealing clothing such as bikinis,” said an announcement on X, which runs the Grok AI tool. “This restriction applies to all users, including paying subscribers.”
The change came hours after California’s top prosecutor announced that the state was investigating the spread of sexualized AI deepfakes, including of children, generated by the AI model.
The update expands existing safeguards by blocking edits that show real people in revealing outfits, regardless of a user’s subscription tier.
X, formerly known as Twitter, also reiterated in a statement on Wednesday that only paying users can edit images using Grok on its platform.
This adds an extra layer…