Highlights
- Ofcom has ordered Facebook, Instagram, TikTok, YouTube, Snapchat and Roblox to prove by 30 April how they will protect children online.
- Ofcom can fine companies up to 10 per cent of global revenue while the ICO can issue fines of up to 4 per cent of annual global turnover.
- The ICO last month fined Reddit nearly £14.5m for failing to introduce meaningful age checks.
The ICO separately issued an open letter to the same platforms, calling on them to adopt "modern, viable" age-assurance tools to stop under-13s accessing services not designed for them. ICO chief executive Paul Arnold said: "There's now modern technology at your fingertips, so there is no excuse."

Both regulators said they had grown increasingly concerned about algorithmic feeds exposing children to harmful or addictive content.
Platforms push back
Meta said it already uses AI-based age detection tools and places teenagers in accounts with built-in protections, adding that age should be verified "centrally at the app store level."
YouTube said it was "surprised to see Ofcom move away from a risk-based approach" and urged the regulator to focus on "high-risk services" failing to comply with the law.
Roblox said it had launched more than 140 new safety features in the past year, including mandatory age checks for chat. Snapchat did not respond and TikTok declined to comment.
Enforcement powers
Ofcom can fine companies up to 10 per cent of their qualifying global revenue for non-compliance, while the ICO can issue fines of up to 4 per cent of a company's global annual turnover.
The ICO last month fined Reddit nearly £14.5m for failing to introduce meaningful age checks and for processing children's data unlawfully, sending a clear signal that regulators are prepared to act against platforms that fall short of their obligations.