The UK media regulator has launched an investigation into Telegram over concerns that it may be failing to prevent child sexual abuse material (CSAM) from being shared.
Ofcom said on Tuesday it was probing the popular messaging service after gathering evidence suggesting CSAM was present and being shared on the platform.
Under the UK's Online Safety Act, user-to-user services operating in the UK must have systems in place to prevent people from encountering CSAM and other illegal content, as well as mechanisms to tackle it, or risk substantial fines for breaches.
Telegram said in a statement that it "categorically denies Ofcom's accusations".
"Since 2018, Telegram has virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with [non-governmental organisations]," it told the BBC.






