Instagram will alert parents when teenagers search for suicide-related content, under a new safety policy aimed at protecting young users. The update applies to supervised accounts in the United States, the United Kingdom, Australia, and Canada.
According to the platform, parents will receive notifications if their teenager repeatedly searches for terms related to suicide or self-harm within a short period. However, these alerts will only be sent to parents who have enabled Instagram’s optional supervision settings.
The move builds on existing safeguards. Instagram blocks certain harmful search terms and redirects users to mental health support resources.
With the new update, parents gain additional visibility into potentially concerning online behaviour. The company said it maintains strict policies against content that promotes or glorifies suicide or self-harm. The rollout is expected to begin next week in the four listed countries.
The announcement comes amid increasing government pressure on social media platforms to strengthen youth protection measures. Australia recently introduced a ban on social media use for under-16s. Britain has also explored tighter online safety regulations, while Spain, Greece, and Slovenia are discussing similar steps.
By introducing parental alerts, Instagram aims to expand its supervision tools and reinforce its broader child safety framework.