Highlights:
- Teen users on Instagram will now see content consistent with a PG-13 movie rating
- Search results and interactions with age-inappropriate accounts will be limited
- Parents can enable stricter “Limited Content” settings for additional control
New content protections for teen users
Meta, Instagram’s parent company, announced on Tuesday that accounts for users under 18 will now default to content equivalent to a PG-13 rating. The update, aimed at ensuring age-appropriate experiences, will be rolled out immediately and fully implemented by the end of 2025.
The move is intended to reassure parents while giving teens some control over their own experience. Meta stated that the settings aim to show teens safe content while maintaining an engaging social platform.
How the restrictions work
Teen accounts will be automatically placed under a 13+ setting, and teens cannot opt out without parental permission.
Under the new rules:
- Teens will be blocked from search results containing terms such as alcohol or gore. These restrictions add to existing safeguards covering suicide, self-harm, and eating disorders.
- Teens will no longer be able to follow accounts that regularly post age-inappropriate content. If they already follow such accounts, they will be prevented from interacting with posts, sending messages, or viewing comments.
- Instagram’s integrated tools, including its question-and-answer features, will also follow PG-13 standards, ensuring responses remain age-appropriate.
Optional parental controls
Meta is also introducing a “Limited Content” option for parents who want an even stricter experience for their teens. This setting filters additional content from feeds and restricts the ability to see, leave, or receive comments.
The new teen account protections follow the launch of Instagram’s “Teen Accounts” last year and are part of a broader trend of tech companies, including YouTube and OpenAI, introducing measures to protect younger users online.