By Adam Mosseri
June 8, 2021
It’s hard to trust what you don’t understand. We want to do a better job of explaining how Instagram works. There are a lot of misconceptions out there, and we recognize that we can do more to help people understand what we do. Today, we’re sharing the first in a series of posts that will shed more light on how Instagram’s technology works and how it impacts the experiences that people have across the app. This first post tries to answer questions like “How does Instagram decide what shows up for me first?”, “Why do some of my posts get more views than others?”, and “How does Instagram decide what to show me in Explore?”
One of the main misconceptions we want to clear up is the existence of “The Algorithm.” Instagram doesn’t have one algorithm that oversees what people do and don’t see on the app. We use a variety of algorithms, classifiers, and processes, each with its own purpose. We want to make the most of your time, and we believe that using technology to personalize your experience is the best way to do that.
When we first launched in 2010, Instagram was a single stream of photos in chronological order. But as more people joined and more was shared, it became impossible for most people to see everything, let alone all the posts they cared about. By 2016, people were missing 70% of all their posts in Feed, including almost half of posts from their close connections. So we developed and introduced a Feed that ranked posts based on what you care about most.
Each part of the app – Feed, Explore, Reels – uses its own algorithm tailored to how people use it. People tend to look for their closest friends in Stories, but they want to discover something entirely new in Explore. We rank things differently in different parts of the app, based on how people use them.
Over the years we’ve learned that Feed and Stories are places where people want to see content from their friends, family, and those they are closest to. Any of these ranking algorithms can be broken down into the same basic steps.
We start by defining the set of things we plan to rank in the first place. With Feed and with Stories this is relatively simple; it’s all the recent posts shared by the people you follow. There are a few exceptions, like ads, but the vast majority of what you see is shared by those you follow.
Next we take all the information we have about what was posted, the people who made those posts, and your preferences. We call these “signals”, and there are thousands of them. They include everything from what time a post was shared, to whether you’re using a phone or the web, to how often you like videos. The most important signals across Feed and Stories, roughly in order of importance, are:

- Information about the post. These are signals both about how popular a post is – how many people have liked it – and more mundane information like when it was posted, how long it is if it’s a video, and what location, if any, was attached to it.
- Information about the person who posted. This helps us get a sense of how interesting the person might be to you, and includes signals like how many times people have interacted with that person in the past few weeks.
- Your activity. This helps us understand what you might be interested in, and includes signals such as how many posts you’ve liked.
- Your history of interacting with someone. This gives us a sense of how interested you are generally in seeing posts from a particular person. An example is whether or not you comment on each other’s posts.
From there we make a set of predictions. These are educated guesses at how likely you are to interact with a post in different ways. There are roughly a dozen of these. In Feed, the five interactions we look at most closely are how likely you are to spend a few seconds on a post, comment on it, like it, reshare it, and tap on the profile photo. The more likely you are to take an action, and the more heavily we weigh that action, the higher up you’ll see the post. We add and remove signals and predictions over time, working to get better at surfacing what you’re interested in.
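To make the weighting idea concrete, here is a minimal Python sketch of how predicted action probabilities and per-action weights could combine into a single ranking score. The action names, probabilities, and weights are illustrative assumptions, not Instagram’s actual values or code.

```python
def score_post(predictions, weights):
    """Combine per-action probabilities into a single ranking score.

    A post's score is the weighted sum of how likely each action is;
    likelier actions with heavier weights push the post higher in Feed.
    """
    return sum(weights[action] * p for action, p in predictions.items())

# Hypothetical predicted probabilities for one post.
predictions = {
    "spend_time": 0.60,   # likely to spend a few seconds on the post
    "comment": 0.05,
    "like": 0.30,
    "reshare": 0.02,
    "tap_profile": 0.10,
}

# Hypothetical per-action weights (assumed for illustration).
weights = {
    "spend_time": 1.0,
    "comment": 3.0,
    "like": 1.5,
    "reshare": 2.0,
    "tap_profile": 0.5,
}

print(round(score_post(predictions, weights), 3))
```

Posts would then be sorted by this score, and adding or removing a signal or prediction amounts to changing which terms feed into the sum.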
There are a few cases where we try to take other considerations into account. One example is that we try to avoid showing too many posts from the same person in a row. Another is Stories that were “reshared” from Feed: until recently, we valued these Stories less, because we’ve heard consistently that people are more interested in seeing original Stories. But we saw a swell of reshared posts in big moments – everything from the World Cup to social unrest – and in those moments people expected their Stories to reach more people than they did, so we stopped valuing reshared Stories less.
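The first of those considerations, spacing out consecutive posts from one account, can be pictured as a greedy re-ranking pass over an already-scored list. This is a toy illustration under assumed data shapes and a simple “no two in a row” rule, not Instagram’s implementation.

```python
def avoid_repeats(ranked_posts):
    """Greedy pass: avoid showing two posts from the same author back to back."""
    result, deferred = [], []
    for post in ranked_posts:
        if result and result[-1]["author"] == post["author"]:
            deferred.append(post)  # would repeat; try to place it later
            continue
        result.append(post)
        # Re-check deferred posts now that the tail author has changed.
        remaining = []
        for d in deferred:
            if result[-1]["author"] != d["author"]:
                result.append(d)
            else:
                remaining.append(d)
        deferred = remaining
    return result + deferred  # anything that can't be spread out goes last

posts = [
    {"id": 1, "author": "a"},
    {"id": 2, "author": "a"},
    {"id": 3, "author": "b"},
    {"id": 4, "author": "a"},
]
print([p["id"] for p in avoid_repeats(posts)])  # [1, 3, 2, 4]
```

The pass keeps the original score order wherever possible and only defers a post when it would immediately repeat the previous author.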
We always want to lean towards letting people express themselves, but when someone posts something that may jeopardize another person's safety, we step in. We have Community Guidelines that apply not only to Feed and Stories, but to all of Instagram. Most of these rules are focused on keeping people safe. If you post something that goes against our Community Guidelines and we find it, we take it down. If this happens repeatedly, we may prevent you from sharing, and eventually we might suspend your account. If you think we’ve made a mistake – and we do make mistakes – you can appeal the decision.
Another important case to call out is misinformation. If you post something that third-party fact checkers label as misinformation, we don’t take it down, but we do apply a label and show the post lower in Feed and Stories. If you’ve posted misinformation multiple times, we may make all of your content harder to find.
Explore was designed to help you discover new things. The grid is made up of recommendations – photos and videos that we go out and find for you – which is very different from Feed and Stories, where the vast majority of what you see is from the accounts you follow.
Again, the first step we take is defining a set of posts to rank. To find photos and videos you might be interested in, we look at signals like what posts you've liked, saved, and commented on in the past. Let’s say you’ve recently liked a number of photos from San Francisco’s dumpling chef Cathay Bi (@dumplingclubsf). We then look at who else likes Cathay’s photos, and then what other accounts those people are interested in. Maybe people who like Cathay are also into the SF dim sum spot @dragonbeaux. In that case, the next time you open Explore, we might show you a photo or video from @dragonbeaux. In practice, this means that if you’re interested in dumplings you might see posts about related topics, like gyoza and dim sum, without us necessarily understanding what each post is about.
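The dumpling example above is a form of co-engagement sourcing: “people who like what you like also like X.” A toy Python sketch of that idea, with made-up users and like data (the account handles are borrowed from the example above purely for illustration):

```python
from collections import Counter

# Hypothetical engagement data: which accounts each user has liked posts from.
likes = {
    "you":   {"dumplingclubsf"},
    "user1": {"dumplingclubsf", "dragonbeaux"},
    "user2": {"dumplingclubsf", "dragonbeaux", "some_cafe"},
    "user3": {"some_cafe"},
}

def source_candidates(viewer, likes):
    """Find accounts liked by users who share the viewer's liked accounts."""
    seen = likes[viewer]
    candidates = Counter()
    for user, liked in likes.items():
        if user == viewer or not (liked & seen):
            continue  # skip the viewer and users with no overlapping taste
        for account in liked - seen:
            candidates[account] += 1  # count co-occurrences as evidence
    return [account for account, _ in candidates.most_common()]

print(source_candidates("you", likes))  # ['dragonbeaux', 'some_cafe']
```

Note that nothing in this sketch understands what a dumpling is; related topics surface only because the same people engage with them, which mirrors the point in the paragraph above.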
Once we’ve found a group of photos and videos you might be interested in, we then order them by how interested we think you are in each one, much like how we rank Feed and Stories. The best way to guess how interested you are in something is to predict how likely you are to do something with the post. The most important actions we predict in Explore include likes, saves, and shares. The most important signals we look at, in rough order of importance, are:

- Information about the post. Here we are looking at how popular a post seems to be. These are signals like how many and how quickly other people are liking, commenting, sharing, and saving it. These signals matter much more in Explore than they do in Feed or Stories.
- Your history of interacting with the person who posted. Most likely the post was shared by someone you’ve never heard of, but if you have interacted with them, that gives us a sense of how interested you might be in what they shared.
- Your activity. These are signals like what posts you’ve liked, saved, or commented on, and how you’ve interacted with posts in Explore in the past.
- Information about the person who posted. These are signals like how many times people have interacted with that person in the past few weeks, which help us find compelling content from a wide array of people.
You don’t follow the people you see in Explore, which changes the dynamic when you come across something problematic. If a friend you follow shares something offensive and you see that in your Feed, that’s between you and your friend. If you see something offensive in Explore from someone you’ve never heard of, that’s a different situation.
That’s why, in addition to our Community Guidelines, we have rules for what we recommend in places like Explore. We call these our Recommendations Guidelines. These include things like avoiding potentially upsetting or sensitive posts; for example, we aim not to show content that promotes tobacco or vaping use in Explore.
Reels is designed to entertain you. Much like Explore, the majority of what you see is from accounts you don’t follow. So we go through a very similar process where we first source reels we think you might like, and then order them based on how interesting we think they are to you.
With Reels, though, we’re specifically focused on what might entertain you. We survey people and ask whether they find a particular reel entertaining or funny, and we learn from that feedback to get better at working out what will entertain people, with an eye towards smaller creators. The most important predictions we make are how likely you are to watch a reel all the way through, like it, say it was entertaining or funny, and go to the audio page (a proxy for whether or not you might be inspired to make your own reel). The most important signals, roughly in order of importance, are:

- Your activity. We look at things like which reels you’ve liked, commented on, and engaged with recently. These signals help us understand what content might be relevant to you.
- Your history of interacting with the person who posted. Like in Explore, the reel was probably made by someone you’ve never heard of, but if you have interacted with them, that gives us a sense of how interested you might be in what they shared.
- Information about the reel. These are signals about the content within the reel, such as the audio track and the visuals, as well as its popularity.
- Information about the person who posted. Here we consider popularity to help find compelling content from a wide array of people and give everyone a chance to find their audience.
The same Recommendations Guidelines that apply to Explore apply to reels. We also avoid recommending certain kinds of reels for other reasons: reels that are low-resolution or watermarked, reels that are muted or contain borders, reels that are mostly text, or reels that focus on political issues.
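The exclusion reasons listed above amount to a rule-based eligibility check that runs before ranking. Here is a hedged sketch of such a filter; the reel fields, the resolution threshold, and the text-fraction cutoff are assumptions made for illustration, not Instagram’s actual criteria.

```python
def is_recommendable(reel):
    """Return False for reels that fail any of the exclusion rules."""
    checks = [
        reel["resolution_p"] >= 720,    # not low-resolution (threshold assumed)
        not reel["has_watermark"],      # no visible watermark
        not reel["is_muted"],
        not reel["has_borders"],
        reel["text_fraction"] < 0.5,    # not majority text (cutoff assumed)
        not reel["is_political"],       # doesn't focus on political issues
    ]
    return all(checks)

reel = {
    "resolution_p": 1080,
    "has_watermark": True,   # e.g. re-uploaded from another app
    "is_muted": False,
    "has_borders": False,
    "text_fraction": 0.1,
    "is_political": False,
}
print(is_recommendable(reel))  # False: the watermark alone makes it ineligible
```

A single failed rule is enough to keep a reel out of recommendations, which is why checks like these are cheap to apply before any scoring happens.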
People often accuse us of “shadowbanning” or silencing them. It’s a broad term that people use to describe many different experiences they have on Instagram. We recognize that we haven’t always done enough to explain why we take down content when we do, what is recommendable and what isn’t, and how Instagram works more broadly. As a result, we understand people are inevitably going to come to their own conclusions about why something happened, and that those conclusions may leave people feeling confused or victimized. That’s never our intention, and we’re working hard on improvements here. We also manage millions of reports a day, which means making a mistake on even a small percentage of those reports affects thousands of people.
We also hear that people consider their posts getting fewer likes or comments a form of “shadowbanning”. We can’t promise that you’ll consistently reach the same number of people when you post. The truth is most of your followers won’t see what you share, because most people look at less than half of their Feed. But we can be more transparent about why we take things down when we do, work to make fewer mistakes – and fix them quickly when we do – and better explain how our systems work. We’re developing better in-app notifications so people know in the moment why, for instance, their post was taken down, and we’re exploring ways to let people know when what they post goes against our Recommendations Guidelines. We’ll have more to share soon, and we’ll also go more in-depth on these topics in this series.
How you use Instagram heavily influences the things you see and don’t see. You help improve the experience simply by interacting with the profiles and posts you enjoy, but there are a few more explicit things you can do to influence what you see:

- Pick your Close Friends. You can choose your close friends for Stories and share with just those people.
- Mute people you’re not interested in. If you want to take a break from someone’s posts without unfollowing them, you can mute their account.
- Mark recommended posts as “Not Interested”. When you see a recommendation you don’t like, you can mark it as “Not Interested”, and we’ll try to show you fewer posts like it going forward.
Providing more context on how content is ranked, shown, and moderated on Instagram is only part of the equation. There is more we can do to help you to shape your Instagram experience based on what you like. We also need to continue to improve our ranking technology and, of course, make fewer mistakes. Our plan is to be proactive about explaining our work across all three areas from here on out. Stay tuned.