On Dec. 31, xAI’s Grok was prompted by a user on X to write a "heartfelt apology note" after the chatbot had generated and shared an image of two young girls in sexualized attire based on a user’s prompt. In the apology, Grok attributed the incident to a “failure in safeguards.” But in the days since, a growing number of women and girls have been digitally “undressed” by the bot.

Ashley St. Clair, a conservative influencer who shares a child with Elon Musk, wrote that she was a target of the digital attacks. People on X, formerly Twitter, used Grok to generate sexual images of her, including one based on a photo of St. Clair at age 14. Other users reported that Grok had edited their photos to “put them into a bikini.”

One of those women is Bella Wallersteiner, a U.K.-based content creator who posted a selfie to X on Dec. 31 to wish her nearly 100,000 followers a happy New Year. She scrolled through the replies, liking tweets that returned her well-wishes. Then she saw a photo of herself in a "Hello Kitty micro bikini." The photo had been edited and published without her consent, Wallersteiner told USA TODAY on Jan. 6.

This trend is part of a growing problem experts call image-based sexual abuse, in which deepfake nonconsensual intimate imagery (NCII) is used to degrade and exploit a person. While anyone can be victimized, 90% of the victims of image-based sexual abuse are women.