Instagram is updating its platform to enhance safety for teenage users, introducing more robust “built-in protections” and expanded controls that let parents monitor their children’s activity. The new “teen accounts” will launch in the UK, US, Canada, and Australia, automatically activating several privacy features for users under 18. For instance, a teen’s content will be hidden from non-followers, and all new follower requests will require approval.
However, users aged 13 to 15 can only change these settings after adding a parent or guardian to their account. The change comes as social media platforms face mounting global pressure to strengthen safeguards protecting young users from harmful content.
Rani Govender, the NSPCC’s online child safety policy manager, emphasized that such efforts should be complemented by proactive measures to prevent harmful content and sexual abuse from spreading on Instagram. Meta has described the updates as a “new experience for teens, guided by parents,” asserting that the changes will better support parents and ensure teenagers have the protections they need.
Ian Russell, who lost his 14-year-old daughter Molly after she was exposed to self-harm and suicide content on Instagram, stressed the importance of effectively implementing these policies.
The new accounts will be private by default, require teenagers to approve new followers manually, apply stringent controls on sensitive content, and mute notifications overnight.