The use of ‘nudify’ apps is becoming more and more prevalent, with hundreds of teachers having seen images created by pupils, often of their peers. The fallout is huge – and growing fast
“It worries me that it’s so normalised. He obviously wasn’t hiding it. He didn’t feel this was something he shouldn’t be doing. It was in the open and people saw it. That’s what was quite shocking.”
A headteacher is describing how a teenage boy, sitting on a bus on his way home from school, casually pulled out his phone, selected a picture from social media of a girl at a neighbouring school and used a “nudifying” app to doctor her image.
Ten years ago it was sexting and nudes causing havoc in classrooms. Today, advances in artificial intelligence (AI) have made it child’s play to generate deepfake nude images or videos featuring what appear to be your friends, your classmates, even your teachers. This may involve removing someone’s clothes, making an image move suggestively or pasting a person’s head on to a pornographic image.
The headteacher does not know why this particular girl – a student at her school – was targeted, whether the boy knew her, or whether the choice was completely random. The incident only came to her attention because another of her pupils spotted the boy, realised what was happening and reported it to the school.