By Adam Mosseri
February 07, 2019
At Instagram, nothing is more important to us than the safety of the people in our community. Over the past month we have seen that we are not where we need to be on self-harm and suicide, and that we need to do more to keep the most vulnerable people who use Instagram safe.
That’s why today, following a comprehensive review with global experts and academics on youth, mental health and suicide prevention, we’re announcing further changes to our approach to self-harm content.
Self-harm and suicide are complex issues and we rely on the input of experts in these fields to help shape our approach. Up until now, we’ve focused most of our approach on trying to help the individual who is sharing their experiences around self-harm. We have allowed content that shows contemplation or admission of self-harm because experts have told us it can help people get the support they need. But we need to do more to consider the effect of these images on other people who might see them. This is a difficult but important balance to get right.
During the comprehensive review, the experts, including the Centre for Mental Health and Save.org, reaffirmed that creating safe spaces for young people to talk about their experiences online – including self-harm – is essential. They advised that sharing this type of content often helps people connect with support and resources that can save lives.
However, they collectively advised that graphic images of self-harm – even when someone is admitting their struggles – have the potential to unintentionally promote self-harm. That is why we are no longer allowing graphic images of self-harm.
Our aim is to have no graphic self-harm or graphic suicide related content on Instagram and to significantly reduce – with the goal of removing – all self-harm and suicide imagery from hashtags, search, the explore tab or as recommended content, while still ensuring we support those using Instagram to connect with communities of support.
We need to create a safe and supportive community for everyone – but this is not as simple as flipping a switch. We will not be able to remove these images immediately, and we must make sure that people posting self-harm related content do not lose their ability to express themselves and connect with help in their time of need. We will get better, and we are committed to finding and removing this content at scale.
We know there’s more that we can do to support the most vulnerable people who use Instagram; that’s why we’ll continue to work with experts and the wider industry to find ways to support people when they’re most in need. You can find out more about our consultation with experts here:
- Adam Mosseri, Head of Instagram