Facebook Echo Chamber Isn't Facebook's Fault, Says Facebook

A peer-reviewed study conducted by Facebook found that its own algorithms aren't to blame for shielding us from opinions we don't like. It's us.

Does the internet facilitate an echo chamber? In an age when so much of the information we see online is filtered through opaque algorithms, the fear is that we only get exposed to viewpoints with which we already agree. Behemoths like Facebook and Google show you new stuff based on what you've previously liked, the argument goes. And so we get stuck in a polarizing cycle that stifles discourse. We only see what we want to see.

But in a new peer-reviewed study published today in Science, Facebook data scientists have for the first time tried to quantify how much the social network’s formula for its News Feed isolates its users from divergent opinions. According to their findings, Facebook’s own algorithms aren't to blame. It's us.

For six months starting in July, Facebook researchers reviewed the anonymized data of 10.1 million US accounts. They found that users' networks of friends and the stories they see do reflect their ideological preferences. But the study found that people were still exposed to differing points of view. According to the researchers, Facebook's algorithm suppresses opposing opinions only about 8 percent of the time for liberals and 5 percent for conservatives. Meanwhile, the researchers say, a user's click behavior---their personal choices---results in 6 percent less exposure to diverse content for liberals and 17 percent less exposure to diverse content for conservatives.
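
To put those percentages side by side, here's a rough back-of-the-envelope sketch in Python. The reduction figures are the ones reported above; the 30 percent baseline share of cross-cutting content and the idea of applying the two reductions one after the other are simplifying assumptions made purely for illustration, not part of the study.

```python
# Back-of-the-envelope illustration of the percentages quoted above.
# The reduction figures come from the study as reported; the baseline share
# of cross-cutting content and the sequential application of the two
# reductions are assumptions for illustration only.

REDUCTIONS = {
    # (algorithmic ranking, user's own clicks) as fractional reductions
    "liberals": (0.08, 0.06),
    "conservatives": (0.05, 0.17),
}

def remaining_cross_cutting(baseline: float, algo_cut: float, click_cut: float) -> float:
    """Apply the two reductions one after the other to a baseline share."""
    after_ranking = baseline * (1 - algo_cut)       # what the News Feed surfaces
    after_clicks = after_ranking * (1 - click_cut)  # what the user actually opens
    return after_clicks

if __name__ == "__main__":
    baseline = 0.30  # hypothetical: 30% of friends' shared links are cross-cutting
    for group, (algo_cut, click_cut) in REDUCTIONS.items():
        seen = remaining_cross_cutting(baseline, algo_cut, click_cut)
        print(f"{group}: {baseline:.0%} baseline -> {seen:.1%} clicked "
              f"(ranking -{algo_cut:.0%}, own clicks -{click_cut:.0%})")
```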

The position of a link in a user's News Feed---determined by the algorithm's interpretation of that user's preferences---also affects the likelihood that the user will click on it, the study found.

Only Yourself to Blame

In other words, the researchers claim that Facebook’s algorithm suppresses opposing content at a lower rate than a user’s own choices.

But as some critics have pointed out, there’s a flawed logic to pitting user choice against Facebook’s algorithm. After all, users can only choose to click on something that the algorithm has already filtered for them. Because what a user chooses to click on feeds into the algorithm, it seems wrong to weight the two factors equally. A user choosing not to click on a certain piece of content is not the equivalent of an algorithm hiding it from the user.
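
To see the critics' point concretely, here's a minimal Python sketch of that two-stage filter. The story counts and suppression rates are hypothetical; the only thing the sketch is meant to show is that the "choice" stage can only ever act on whatever the ranking stage has already surfaced.

```python
# A minimal sketch of the critics' point: the user's "choice" stage only sees
# what the ranking stage has already let through, so the two filters are
# sequential, not independent. All names and numbers here are hypothetical.

import random

random.seed(0)

def ranking_stage(stories, suppress_rate):
    """Stage 1: the feed drops a fraction of cross-cutting stories before the user sees them."""
    return [s for s in stories
            if not (s["cross_cutting"] and random.random() < suppress_rate)]

def click_stage(feed, skip_rate):
    """Stage 2: the user skips a fraction of the cross-cutting stories that remain."""
    return [s for s in feed
            if not (s["cross_cutting"] and random.random() < skip_rate)]

stories = [{"id": i, "cross_cutting": i % 3 == 0} for i in range(30_000)]

feed = ranking_stage(stories, suppress_rate=0.08)   # the algorithm filters first
clicked = click_stage(feed, skip_rate=0.06)         # the user chooses only from that feed

print("cross-cutting stories shared by friends:", sum(s["cross_cutting"] for s in stories))
print("surfaced by the feed:                   ", sum(s["cross_cutting"] for s in feed))
print("actually clicked:                       ", sum(s["cross_cutting"] for s in clicked))
# Stories removed in stage 1 never reach stage 2, so "user choice" only acts
# on the already-filtered feed; the two suppression rates aren't directly comparable.
```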

Notably, the sample of users Facebook studied was also limited to individuals who chose to self-identify as either liberal or conservative in their profiles. That’s a mere 9 percent of the network’s 1.4 billion users. It stands to reason that those who don’t self-report their political leanings could behave very differently from those who do.

This is not the first time Facebook has conducted research on its users. In a study quietly conducted last year, Facebook altered the number of positive and negative posts that some users saw to measure the effect on their moods. The idea that Facebook was deliberately trying to manipulate people's moods upset a lot of users. But if Facebook's new numbers are correct, we're the ones who need to drag ourselves out of our own echo chambers. If Facebook is driving us apart, we're going along for the ride.