UK Regulator Takes Action Against Messaging and Chat Platforms
The United Kingdom's online safety watchdog, Ofcom, announced on Tuesday that it has opened a formal investigation into Telegram following evidence suggesting the messaging platform has been used to share child sexual abuse material (CSAM). Alongside the Telegram probe, Ofcom also announced separate investigations into two teen-focused chat services — Chat Avenue and Teen Chat — over concerns about child grooming and exposure to harmful content.
How the Telegram Investigation Began
The inquiry into Telegram was triggered after the Canadian Centre for Child Protection shared information with Ofcom that allegedly demonstrated CSAM is being distributed on the platform. Following receipt of that evidence, Ofcom conducted its own independent assessment of the service and concluded that a formal investigation was warranted.
The central question of the Telegram probe is whether the company has violated the UK's Online Safety Act. That legislation requires providers of what Ofcom classifies as user-to-user services to actively monitor and address risks associated with child sexual abuse and exploitation on their platforms.
Telegram Pushes Back
Telegram responded swiftly and forcefully to the announcement. A company spokesperson said Telegram has "virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with NGOs."
The spokesperson went further, saying: "Telegram categorically denies Ofcom's accusations. We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy."
The company's framing of the investigation as a potential threat to civil liberties is notable and suggests Telegram may mount a robust legal and public relations defense as the inquiry proceeds.
Teen Chat Sites Also Under Scrutiny
In addition to the Telegram probe, Ofcom announced investigations into Chat Avenue and Teen Chat, focusing on whether these services are doing enough to prevent children from being targeted by predators. Both platforms feature open chatrooms and provide users with private messaging and media sharing capabilities — features that Ofcom and child protection partners believe can be exploited by those seeking to groom minors.
The Chat Avenue investigation carries an additional dimension: Ofcom is also examining whether the platform is taking adequate steps to prevent children from accessing harmful content such as pornography.
Ofcom said it decided to investigate the chat sites based on information received from child protection agencies it works with to identify services that may be facilitating grooming. Crucially, the regulator noted it had already engaged in discussions with both platforms about their safety practices and remained unsatisfied with the responses it received.
How the Platforms Responded
Teen Chat did not immediately respond to requests for comment. A spokesperson for Chat Avenue issued a statement asserting the company's commitment to child safety:
"We are committed to maintaining a safe environment and have implemented a range of child safety measures, with ongoing efforts to further strengthen these protections. We do not agree that grooming is prevalent on our platform. Automatic message deletion, active word filters, active monitoring and user reporting tools are all designed to reduce the risk of harmful behavior and make the site a less suitable environment for predators."
What Ofcom's Enforcement Powers Mean in Practice
Ofcom's Director of Enforcement, Suzanne Cater, underscored the gravity of the situation in a statement accompanying the announcement:
"Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities. It's why we work so closely with partners in law enforcement and child protection organizations to identify where these harms are occurring and hold providers to account where they're failing to meet their obligations."
Should Ofcom's investigations conclude that any of the three platforms have breached the Online Safety Act, the regulator will issue a provisional decision. The companies involved will then have an opportunity to respond before a final determination is reached.
The financial stakes are significant. Under the Online Safety Act, Ofcom has authority to compel platforms to make operational changes and can impose fines of up to £18 million ($24.3 million) or 10% of qualifying worldwide revenue, whichever is greater — a penalty scale designed to make non-compliance genuinely costly even for large global operators.
A Broader Regulatory Moment
These investigations represent some of the most high-profile enforcement actions taken under the Online Safety Act since its passage, and they signal that Ofcom is prepared to move aggressively against platforms — large and small — that it believes are falling short of their legal obligations to protect children online. The outcome of the Telegram investigation in particular is likely to attract international attention, given the platform's global reach and its vocal stance on privacy and free expression.
Source: The Record