LONDON, April 21 (Reuters) – Ofcom, the UK communications regulatory agency, opened an investigation on Tuesday into the messaging app Telegram, after evidence suggested that child sexual abuse material was being shared on the platform.
The investigation is part of the UK’s efforts to hold online platforms accountable for children’s exposure to harmful content. While the country’s 2023 Online Safety Act set stricter standards for social media platforms like Facebook, YouTube and TikTok, Prime Minister Keir Starmer wants them to go further.
The government has been consulting on a possible social media ban for children under 16, and Starmer met last week with executives from social media companies, urging them to take more responsibility.
Ofcom said it had received evidence from the Canadian Centre for Child Protection about the alleged presence and sharing of child sexual abuse material on Telegram and had carried out its own assessment of the platform.
“In light of this, we have decided to open an investigation to examine whether Telegram has failed, or is failing, to meet its obligations in relation to illegal content,” Ofcom said in a statement.
Telegram said it “categorically” denied Ofcom’s allegations, adding that since 2018 it had “virtually eliminated” the public dissemination of child sexual abuse material on its platform through detection algorithms.
“We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of expression and the right to privacy,” the Dubai-based company said in a statement.
Telegram was fined in February by Australia’s online safety regulator for being slow to respond to questions about measures taken to prevent the spread of child abuse and violent extremist material.
The UK’s Ofcom said on Tuesday it had also opened investigations into Teen Chat and Chat Avenue to see whether these platforms were meeting their obligations to protect children from the risk of being targeted by grooming.
Ofcom said that, after speaking with the companies, it remained unconvinced that they were offering adequate protection to British children against the risk of grooming.
“These companies need to do more to protect children, or they will face serious consequences under the Online Safety Act,” Suzanne Cater, Ofcom’s director of enforcement, said in the statement.
(Reporting by Muvija M; editing by Paul Sandle and Susan Fenton)