In response to global regulatory pressures, Instagram and Facebook have announced plans to hide more content from teenage users to protect them from harmful material on the apps. Meta, the parent company, stated in a recent blog post that all teen users would be subject to the most restrictive content control settings.
Additionally, the availability of certain search terms on Instagram will be further limited. This initiative is designed to make it harder for teens to encounter content related to sensitive topics such as suicide, self-harm, and eating disorders, especially when using features like Search and Explore on Instagram.
Meta has indicated that these measures, expected to be implemented over the coming weeks, aim to provide a more “age-appropriate” experience for young users. The company’s move comes amid heightened scrutiny from regulators in the United States and Europe over concerns that its apps contribute to addiction and a youth mental health crisis.
Regulatory Challenges and Response to Allegations
In the United States, attorneys general from 33 states, including California and New York, sued Meta in October, accusing it of misleading the public about the dangers its platforms pose to young users. The European Commission has also sought information on how Meta safeguards children from illegal and harmful content. This increased regulatory focus followed testimony by a former Meta employee in the U.S. Senate alleging that the company knew about, but failed to act on, harassment and other harms faced by teens on its platforms.
Arturo Bejar, the former Meta employee, criticized the company’s latest changes, stating they did not address his concerns about its approach to harm. He argued that Meta still lacks effective mechanisms for teens to report unwanted advances and relies on inadequate definitions of harm. Bejar emphasized the need for a more transparent discussion about the actual experiences of harm by teens.
Children and teens have long been a target demographic for brands advertising on Facebook and Instagram, which seek to build brand loyalty at an impressionable age. Meta has faced intense competition from TikTok for young users as Facebook’s popularity among teens has declined. According to a 2023 Pew Research Center survey, 63% of U.S. teens use TikTok and 59% use Instagram, while only 33% reported using Facebook.
Meta’s new measures to protect teens on Instagram and Facebook represent an effort to address growing concerns about the safety of young users on social media. Amid regulatory challenges and increasing competition, Meta is taking steps to offer a safer, more age-appropriate online experience for teenagers.