A substantial number of AI images generated or edited with Grok are targeting women in religious and cultural clothing.

A WIRED review of outputs hosted on Grok’s official website shows it’s being used to create violent sexual images and videos, as well as content that includes apparent minors.

An expert explains how simple it would be to tweak Grok to block CSAM outputs.

The AI tool was also used to "undress" an image of the woman killed by a federal immigration agent in the US, researchers say.

Grok tests whether the UK can penalize platforms for AI-generated sexualized deepfakes.