UK Regulator Ofcom Launches Investigations into Telegram, X Over Child Safety Concerns
AI-generated from multiple sources. Verify before acting on this reporting.
LONDON — The UK communications regulator Ofcom launched formal investigations on Monday into Telegram, X, and two teen-focused chat platforms over alleged failures to prevent the sharing of illegal content and to protect minors.
The regulatory body announced the probes into the messaging service Telegram, the social media platform X, and the websites Teen Chat and Chat Avenue. The investigations stem from concerns regarding the distribution of child sexual abuse material (CSAM) on Telegram, child grooming activities on the teen chat sites, and the circulation of nonconsensual sexually explicit content on X.
Ofcom stated that the inquiries will assess whether the companies have complied with their legal obligations under the Online Safety Act. The regulator indicated that the platforms face scrutiny over their content moderation systems and the effectiveness of their safety measures for users in the United Kingdom.
Telegram has faced increasing pressure from international authorities regarding the use of its channels for illicit activities. The regulator's announcement marks a significant escalation in the UK government's efforts to enforce stricter digital safety standards. The probe into X follows a series of complaints regarding the platform's handling of harmful content, including images and videos shared without consent.
Teen Chat and Chat Avenue are targeted for alleged failures to prevent adults from contacting minors for sexual purposes. Ofcom's investigation will examine the platforms' age verification processes and the mechanisms in place to detect and remove grooming behavior.
The companies have not yet issued public statements regarding the specific allegations. Telegram has historically resisted calls for backdoor access to encrypted messages, citing user privacy. X has previously argued that its content moderation policies align with free speech principles, though it has faced criticism for inconsistent enforcement.
The Online Safety Act requires platforms to take proactive steps to protect users from illegal content and harmful material. Failure to comply can result in substantial fines and potential restrictions on access to the UK market. For serious breaches, Ofcom has the authority to impose penalties of up to 10% of global turnover or £18 million, whichever is greater.
The investigations are expected to take several months to complete. Ofcom will gather evidence from the platforms and may interview company executives. The regulator has indicated that interim measures could be implemented if immediate risks to children are identified.
Industry analysts suggest the probes could set a precedent for how regulators handle global tech giants operating in the UK. The outcome may influence similar actions by other national authorities. Questions remain regarding the technical feasibility of the safety measures required and the balance between privacy and security.
The regulator has not committed to a formal deadline for concluding the investigations. Ofcom will continue to monitor the platforms' compliance with safety obligations while the probes proceed, and the companies must cooperate with the inquiries to avoid further regulatory action.