Highlights
- Regulators launch a review of the top 10 mobile games used by children
- Parents report high concern over data collection, exposure to strangers, and harmful content
- Move follows earlier action that prompted major platforms to improve protections for young users
ICO launches review of children’s privacy in leading mobile games
The UK’s Information Commissioner’s Office (ICO) is setting up a monitoring programme to examine how popular mobile games handle children’s data, as regulators turn their attention to a sector used daily by millions of young players.
With around 90 percent of children in the UK playing games on phones and tablets, the ICO says it will review 10 of the most widely used mobile games. The assessment will focus on default privacy settings, geolocation controls, targeted advertising, and any additional data practices identified during the review.
New research from the regulator shows strong parental concern about online safety in gaming. Eighty-four percent of parents worry about potential exposure to strangers or harmful content, with half saying they are “very concerned.” Most also express unease about children sharing personal information (76 percent) and the collection of data for advertising (75 percent). Three in ten parents say their child has stopped using a game over concerns about how information was collected or used.
Intrusive design features trigger scrutiny
John Edwards, the UK Information Commissioner, says early analysis suggests that many mobile games rely on design choices that can be particularly intrusive for children. He notes that the review aims to ensure games adhere to the Children’s Code, the data protection framework that has already prompted major changes across social media and video platforms.
The ICO says the expansion into mobile gaming is the next step in raising privacy standards across services used by young people.
In a sign of growing global momentum around children’s online safety, Australia has announced plans to bar children under 16 from using social media platforms and to mandate that platforms take reasonable steps to block underage users or face fines of up to £24.5 million (AUD 50 million).
Progress already made under the Children’s Code
In its latest progress report, covering the period since its last update in March 2025, the ICO cites:
- Improvements or confirmed good practice on children’s privacy settings across 10 platforms, including Twitch, Viber, and Hoop. Updates include private accounts by default, just-in-time privacy notices, and reduced visibility of child profiles.
- Direct engagement with Snap and Meta over how Snapchat and Instagram handle children’s geolocation data, particularly through map features.
- Notices of intent to issue monetary penalties to MediaLab (Imgur) and Reddit following investigations into their use of children’s data and age-assurance measures.
- A review of age-assurance systems on 17 platforms popular with young users, with a new monitoring programme planned to drive stronger protections on high-risk services.
The ICO says its previous interventions have already influenced how platforms protect more than three million children, and that the new monitoring programme could improve online privacy for nearly 12 million young users across the UK.