UK Regulator Investigates Telegram and Teen Chat Sites Over Child Safety Concerns
Ofcom has opened formal investigations into Telegram, Teen Chat, and Chat Avenue under the UK’s Online Safety Act. The regulator is examining whether Telegram has done enough to stop child sexual abuse material from being shared, and whether the two chat sites have adequate systems to protect children from grooming risks.
The Telegram investigation follows evidence from the Canadian Centre for Child Protection, along with Ofcom’s own assessment of the platform. Ofcom says it will examine whether Telegram has met its duties to prevent users from encountering illegal content, limit how long such material remains online, and reduce the risk of the service being used to facilitate offences linked to child sexual abuse material.
Telegram rejected the concerns and said it has “virtually eliminated” the public spread of child sexual abuse material on its platform since 2018. The company also said it was surprised by the investigation and suggested the probe may be part of a wider attack on platforms that defend privacy and free speech.
Ofcom is testing the Online Safety Act in a major enforcement push
The investigations show how Ofcom is using its new powers under the Online Safety Act. The law places duties on user-to-user platforms and search services to assess risks, reduce illegal harm, and act quickly when illegal material appears.
For Telegram, Ofcom is focusing on illegal content safety duties. These include preventing users from encountering priority illegal content, reducing the risk that the service supports priority offences, and removing illegal material quickly once a platform becomes aware of it.
For Teen Chat and Chat Avenue, the regulator is looking at risk assessments, safety systems, reporting tools, moderation, and measures designed to protect child users from illegal harm. Ofcom says both investigations remain open.
What Ofcom is investigating
| Platform | Main concern | What Ofcom is checking |
|---|---|---|
| Telegram | Alleged sharing of child sexual abuse material | Whether it meets illegal content safety duties |
| Teen Chat | Risk of child grooming | Whether it completed proper risk assessments and safety measures |
| Chat Avenue | Risk of child grooming | Whether it protects users from illegal content and harmful contact |
| X | Grok-related sexualized imagery | Whether X complied with illegal content duties under the Online Safety Act |
The Telegram case stands out because the platform has long positioned itself as a privacy-focused messaging service. That makes moderation questions more complicated, especially when regulators ask platforms to prevent illegal material while users expect private communication tools.
Ofcom’s investigation does not automatically mean Telegram or the chat sites broke the law. It means the regulator believes there is enough evidence to examine whether the services failed to meet their legal duties.
If Ofcom finds failures, it can demand changes, impose fines, or seek stronger measures in serious cases. The regulator can fine companies up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.
Teen chat sites face separate grooming-risk probes
Ofcom also opened investigations into Teen Chat and Chat Avenue. These cases focus on whether the services have taken suitable steps to identify and reduce the risk that predators could use the platforms to contact or groom children.
The Teen Chat investigation covers whether the service completed a suitable illegal content risk assessment, especially for child sexual exploitation and abuse risks. Ofcom will also examine whether the platform has moderation, reporting, and protection systems for child users.
The Chat Avenue investigation follows a similar path. Ofcom says it will review illegal content risk assessments, children’s access assessments, children’s risk assessments, and compliance with duties that apply to regulated user-to-user services.
Why this matters for platforms and users
The UK’s Online Safety Act shifts more responsibility onto digital platforms. Services cannot rely only on user reports after harm occurs. They must assess risks in advance, build safety systems, and prove that they can reduce illegal harm.
This matters most for services used by children or platforms where strangers can contact each other. Open chat rooms, public groups, private messages, and weak identity checks can create serious safety risks when moderation falls short.
The Telegram investigation also raises broader questions about encrypted and privacy-first services. Regulators want stronger action against illegal material, while platforms argue that user privacy and freedom of expression must remain protected.
Ofcom is also investigating X over Grok content
The Telegram and teen chat cases are not Ofcom’s only Online Safety Act investigations. The regulator also opened a formal investigation into X over concerns that the Grok AI chatbot account was used to create and share sexualized deepfake images of real people, including children.
Ofcom said it acted after reports that Grok was used to generate demeaning sexual deepfakes. The regulator is examining whether X complied with its duties to protect UK users from illegal content.
Together, the investigations show that Ofcom is targeting both older platform risks and newer AI-driven harms. The same legal framework now applies across messaging apps, chat sites, social platforms, and AI-enabled content tools.
What companies may need to show
Under the Online Safety Act, platforms may need to prove that they understand the risks on their services and have working systems to reduce them.
That can include:
- Clear illegal content risk assessments
- Children’s access assessments where relevant
- Child safety risk assessments
- Effective moderation systems
- Reporting and complaint tools
- Fast removal processes for illegal material
- Measures to stop repeat misuse
- Records showing how safety decisions were made
These duties matter because Ofcom can ask platforms to explain their systems and provide evidence. A platform that has written policies but weak enforcement may still face scrutiny.
The regulator can also seek court orders in the most serious cases. Ofcom says those orders could require third parties, including payment providers, advertising services, or internet service providers, to disrupt or block access to a non-compliant service in the UK.
FAQ
What is Ofcom investigating Telegram for?
Ofcom is investigating whether Telegram has complied with its duties under the UK’s Online Safety Act to prevent child sexual abuse material from being shared on the platform.
Why are Teen Chat and Chat Avenue under investigation?
Ofcom is investigating whether the two chat services have properly assessed and reduced grooming risks and whether they have suitable systems to protect child users from illegal harm.
Has Telegram responded to the investigation?
Yes. Telegram said it has virtually eliminated the public spread of child sexual abuse material since 2018 and raised concerns that the investigation could affect privacy and free speech.
What penalties could the platforms face?
Ofcom can impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher. In serious cases, it can also seek court orders that may disrupt or block a service in the UK.