UK watchdogs give social media giants April deadline to improve child safety

Regulators give platforms until 30 April to tighten age checks and restrict strangers from contacting children under the Online Safety Act


Highlights

  • Ofcom has ordered Facebook, Instagram, TikTok, YouTube, Snapchat and Roblox to prove by 30 April how they will protect children online.
  • Ofcom can fine companies up to 10 per cent of global revenue while the ICO can issue fines of up to 4 per cent of annual global turnover.
  • The ICO last month fined Reddit nearly £14.5m for failing to introduce meaningful age checks.
Ofcom and the Information Commissioner's Office have warned major social media platforms, including Facebook, Instagram, TikTok, YouTube, Snapchat and Roblox, to strengthen child safety measures.
In the latest implementation phase of Britain's Online Safety Act, Ofcom told the platforms to show by 30 April how they would tighten age checks, restrict strangers from contacting children, make algorithmic feeds safer and stop testing new products on minors.
Ofcom chief executive Melanie Dawes told Reuters: "These online services are household names, but they're failing to put children's safety at the heart of their products. That must now change quickly, or Ofcom will act."

The ICO separately issued an open letter to the same platforms calling on them to adopt "modern, viable" age-assurance tools to stop under-13s accessing services not designed for them.

ICO chief executive Paul Arnold said: "There's now modern technology at your fingertips, so there is no excuse." Both regulators said they had grown increasingly concerned about algorithmic feeds exposing children to harmful or addictive content.


Platforms push back

Meta said it already uses AI-based age detection tools and places teenagers in accounts with built-in protections, adding that age should be verified "centrally at the app store level."

YouTube said it was "surprised to see Ofcom move away from a risk-based approach" and urged the regulator to focus on "high-risk services" failing to comply with the law.

Roblox said it had launched more than 140 new safety features in the past year, including mandatory age checks for chat. Snapchat did not respond and TikTok declined to comment.

Ofcom can fine companies up to 10 per cent of their qualifying global revenue for non-compliance, while the ICO can issue fines of up to 4 per cent of a company's global annual turnover.

The ICO last month fined Reddit nearly £14.5m for failing to introduce meaningful age checks and for processing children's data unlawfully, sending a clear signal that regulators are prepared to act against platforms that fall short of their obligations.
