We want teens to have safe, age-appropriate experiences on our apps. We’ve developed more than 30 tools and resources to support teens and their parents, and we’ve spent over a decade developing policies and technology to address content that breaks our rules or could be seen as sensitive. Today, we’re announcing additional protections that are focused on the types of content teens see on Instagram and Facebook.
We regularly consult with experts in adolescent development, psychology and mental health to help make our platforms safe and age-appropriate for young people, including improving our understanding of which types of content may be less appropriate for teens.
Take the example of someone posting about their ongoing struggle with thoughts of self-harm. This is an important story that can help destigmatize these issues, but it’s a complex topic and isn’t necessarily suitable for all young people. Now, we’ll start to remove this type of content from teens’ experiences on Instagram and Facebook, along with other types of age-inappropriate content. We already aim not to recommend this type of content to teens in places like Reels and Explore, and with these changes, we’ll no longer show it to teens in Feed and Stories, even if it’s shared by someone they follow.
We want people to find support if they need it, so we’ll continue to share resources from expert organizations like the National Alliance on Mental Illness when someone posts content related to their struggles with self-harm or eating disorders. We’re starting to roll these changes out to teens under 18 now, and they’ll be fully in place on Instagram and Facebook in the coming months.
Here’s more detail on how today’s updates expand on our existing protections, in line with feedback from experts:
We’re automatically placing teens into the most restrictive content control setting on Instagram and Facebook. We already apply this setting to new teens when they join Instagram and Facebook, and we’re now expanding it to teens who are already using these apps. Our content recommendation controls, known as “Sensitive Content Control” on Instagram and “Reduce” on Facebook, make it more difficult for people to come across potentially sensitive content or accounts in places like Search and Explore.
While we allow people to share content discussing their own struggles with suicide, self-harm and eating disorders, our policy is not to recommend this content, and we’ve been focused on ways to make it harder to find. Now, when people search for terms related to suicide, self-harm and eating disorders, we’ll start hiding these related results and directing them to expert resources for help. We already hide results for suicide and self-harm search terms that inherently break our rules, and we’re extending this protection to include more terms. This update will roll out for everyone over the coming weeks.
To help make sure teens regularly check their safety and privacy settings on Instagram, and are aware of the more private settings available, we’re sending new notifications encouraging them to update their settings to a more private experience with a single tap. If teens choose to “Turn on recommended settings,” we’ll automatically update their settings to restrict who can repost their content, tag or mention them, or include their content in Reels Remixes. We’ll also make sure only their followers can message them, and we’ll help hide offensive comments.
Learn more about Instagram teen privacy and safety settings here and Meta’s teen privacy and safety settings here.