Deluge of ‘nudified’ images on social media platform X raises questions about regulation of AI technologies

The deluge of images of partly clothed women – stripped by the Grok AI tool – on Elon Musk’s X has raised further questions over regulation of the technology. Is it legal to produce these images without the subject’s consent? Should they be taken off X?

In the UK alone, the answers to these questions are uncertain. Social media regulation is still a nascent area, and controlling the deployment of artificial intelligence even more so. Laws exist to tackle the problem, such as the Online Safety Act, but the government has yet to introduce additional measures such as a ban on nudifying apps.

Under the Sexual Offences Act in England and Wales, it is a criminal offence to share intimate images of someone without their consent, and this includes images created by AI. The law sets out what constitutes an intimate image: one showing a person engaged in a “sexual act”, doing a “thing which a reasonable person would consider to be sexual”, or with their exposed genitals, buttocks or breasts.

The definition also covers images of a person in underwear, or in wet or transparent clothing, that expose those body parts. However, according to Clare McGlynn, a professor of law at Durham University and an expert in pornography regulation, “just the prompt ‘bikini’ would not strictly be covered” by the law.