Campaigners welcome criminalisation of non-consensual AI-generated explicit images but say law does not go far enough

Victims of deepfake image abuse have called for stronger protection against AI-generated explicit images, as a law criminalising the creation of non-consensual intimate images comes into effect.

Campaigners from Stop Image-Based Abuse delivered a petition to Downing Street with more than 73,000 signatures, urging the government to introduce civil routes to justice such as takedown orders for abusive imagery on platforms and devices.

“Today’s a really momentous day,” said Jodie, a victim of deepfake abuse who uses a pseudonym.

“We’re really pleased the government has put these amendments into law that will definitely protect more women and girls. They were hard-fought victories by campaigners, particularly the consent-based element of it,” she added.