A woman whose image was digitally undressed by Elon Musk’s AI has told the BBC she feels “dehumanised and reduced to a sexual stereotype”. It comes after India ordered Musk’s X to fix “obscene” Grok AI content.
The BBC has also witnessed several instances on the social media platform X of users asking the chatbot to undress women, put them in bikinis without their consent, and place them in sexual scenarios.
XAI, the company that operates Grok, did not respond to a request for comment, except to send a reply stating “legacy media lies”.
Samantha Smith posted on X that her image had been doctored, prompting replies from copycats and further requests for Grok to make more images of her.

“Women are not agreeing to this,” she said.
“Do you understand how creepy this is?” she said. “It’s not even me, it was an image that had been made nude, but it looked like me and felt like me, and it felt as gross and violating as if someone had actually posted a nude or bikini picture I’d taken myself.”
A Home Office spokesman told the BBC: “We’re introducing legislation to ban these intimate image abuse tools, and under a new criminal offence anyone selling them will face a prison sentence and hefty fines.”
Tech firms must take steps to prevent people in the UK from accessing illegal content on their platforms, Ofcom said, but the regulator would not confirm whether it was currently investigating X or Grok over the AI images.
Grok is a free AI assistant, with premium paid-for features, that replies to X users when they tag it in a post. It is frequently used to respond to or add context to other users’ comments, but people on X can also ask it to edit an uploaded image.
It has come under fire for generating nude and sexualised content in photos and videos, and was previously blamed for creating sexually explicit videos of Taylor Swift.
Clare McGlynn, a professor of law at Durham University, said X or Grok “could stop these types of abuse if they wanted to” and that they “seem to act with impunity”.
“This platform has allowed the creation and distribution of these images for months with impunity, and we have seen no pushback from regulators,” she said.
XAI’s own terms of service, meanwhile, prohibit “depictions (images or videos) of near-nude individuals engaging in sex acts”.
In a statement, Ofcom said it was a crime to “create or share non-consensual intimate images or child sexual abuse material”, and confirmed that this includes AI-generated sexual deepfakes.
It said platforms such as X had to take “substantive measures” to limit harm, but did not specify exactly what would be considered acceptable.

