Suicide and self-harm content is still recommended to teenagers ‘at industrial scale’ by TikTok and Instagram eight years after Molly Russell’s tragic death, according to new research.

The Molly Rose Foundation found that social media algorithms are ‘putting young lives at risk’ by recommending depression, suicide and self-harm content to accounts registered as belonging to a 15-year-old girl.

The foundation - set up by Ian Russell after his 14-year-old daughter Molly took her own life having viewed harmful content online - is calling for the Prime Minister to introduce tougher measures to stop ‘preventable harm happening on his watch’.

The report, Pervasive-by-design, was conducted in the weeks leading up to the implementation of the Online Safety Act and claims that social media platforms are ‘gaming Ofcom’s new rules’.

The Molly Rose Foundation said its research found the material recommended by both TikTok and Instagram ‘would have the same harmful impact as content which Molly Russell saw before her death in 2017’.