Instagram Rolls Out PG-13 Content Filters for Teen Accounts by Default
Instagram is adding stricter safety rules to protect users under 18. The social network will now automatically place teens into a PG-13 mode, which filters out content with strong language, drug references, risky stunts, or mature themes.
Meta says teens cannot change these settings without parental consent. The platform will also restrict interactions with accounts that frequently post adult content; teens who already follow such accounts will lose access to their content.
These changes build on Instagram’s existing safeguards: teen accounts are private by default, block certain sensitive topics, and limit direct messages from unknown accounts. But critics question whether these measures go far enough.
Meta also plans to launch a “limited content” setting that gives parents even more control. The company will additionally use AI and age-prediction tools to detect users who misreport their age to bypass the safeguards.
Instagram says the update is its most significant overhaul since it introduced dedicated teen accounts. The new filters will roll out first in the U.S., U.K., Canada, and Australia, then expand globally by year-end.
Advocates and researchers welcome the move but call for transparency. Some warn that previous safety tools have fallen short in practice. Success will depend on how consistently Instagram enforces the new rules.