Grok AI being used to ‘digitally remove women’s clothing’

A woman has told the BBC she felt "dehumanised and reduced into a sexual stereotype" after Grok was used to digitally remove her clothing.

The BBC has seen several examples on the social media platform X of people asking the chatbot to digitally undress women, making them appear in bikinis without their consent or placing them in sexual situations.

xAI, the company behind Grok, did not respond to a request for comment, other than with an automatically generated reply stating "legacy media lies".

Samantha Smith shared a post on X about her image being altered, which drew comments from others who had experienced the same, before some users asked Grok to generate more images of her.

"While it wasn't me that was in states of undress, it looked like me and it felt like me and it felt as violating as if someone had actually posted a nude or a bikini picture of me," she said.

The regulator Ofcom said tech firms must "assess the risk" of people in the UK viewing illegal content on their platforms, but did not confirm whether it was currently investigating X or Grok in relation to AI images.

The proliferation of AI image-generating platforms since the launch of ChatGPT in 2022 has raised concerns over content manipulation and online safety across the board. It has also contributed to a growing number of platforms producing deepfake nudes of real people.

In a related development, Grok on Friday said lapses in safeguards had resulted in "images depicting minors in minimal clothing" appearing on X, and that improvements were being made to prevent this.

Screenshots shared by users on X showed Grok's public media tab filled with altered images, which users said were produced when they uploaded photos and prompted the bot to modify them.

"There are isolated cases where users prompted for and received AI images depicting minors in minimal clothing," Grok said in a post on X. "xAI has safeguards, but improvements are ongoing to block such requests entirely."

"As noted, we've identified lapses in safeguards and are urgently fixing them. CSAM is illegal and prohibited," Grok said, referring to child sexual abuse material.

Published in Dawn, January 3rd, 2026