
Opinion | Facebook serves as an echo chamber, especially for conservatives. Blame its algorithm.

By Steven L. Johnson, Brent Kitchens and Peter Gray
October 26, 2020 at 7:00 a.m. EDT

Steven L. Johnson, Brent Kitchens and Peter Gray are information technology professors at the University of Virginia McIntire School of Commerce. Their peer-reviewed research referenced in the op-ed is forthcoming as an article titled “Understanding Echo Chambers and Filter Bubbles: The Impact of Social Media on Diversification and Partisan Shifts in News Consumption” in the academic journal MIS Quarterly.

When Mark Zuckerberg appears at a Senate committee hearing on Wednesday, he will no doubt be asked about Facebook’s content moderation policies. He may face important questions about how Facebook decides when content is not merely misleading, but outright false and harmful. An equally important question lawmakers may not ask — but should — is how Facebook’s algorithms shape the news and information that users consume.

Zuckerberg has recently denied that Facebook is a right-wing echo chamber. Instead, he claimed that the content that gets the most visibility on Facebook is “the same things that people talk about in the mainstream.” But Zuckerberg also admits that “it’s true that partisan content” attracts more likes, comments and shares. Those are precisely the signals that lead Facebook algorithms to push polarizing partisan content into people’s Facebook feeds.

Zuckerberg’s public comments have dodged a crucial question: What effect does Facebook have on the political slant of the news and information its users read? Our peer-reviewed research shows that the more time someone spends on Facebook, the more polarized their online news consumption becomes. What’s more, we find Facebook usage is five times more polarizing for conservatives than for liberals. This evidence suggests Facebook indeed serves as an echo chamber, especially for its conservative users.

Our analysis traces four years of visits made to social media platforms (Facebook, Reddit and Twitter) and online news sites by nearly 200,000 typical U.S. Internet users, using data from an opt-in consumer panel. When we matched up each user’s social media use with their visits to news sites, we found that both Facebook and Reddit connect their users with a wider, more diverse range of news sources than they would typically visit on their own. This is intuitive: One reason people use social media is to discover news.

But when we analyzed the average partisan slant of each user’s news site visits, we found a surprising pattern. Facebook and Reddit shape the news consumption of their conservative users in dramatically different ways. In months when a typical conservative visited Facebook more than usual, they read news that was about 30 percent more conservative than the online news they usually read. In contrast, during months when a typical conservative used Reddit more than usual, they read news that was far less conservative — about 50 percent more moderate than what they typically read.

Why would Facebook steer conservatives toward more polarized news sites, and Reddit toward more politically moderate ones? The answer may lie in three ways their algorithms differ: how they consider social networks, topical interests and engagement history. First, on Facebook you can only be friends with people who agree to be friends with you, too. This minimizes the chances of seeing content from people with more diverse, opinion-challenging viewpoints. Second, Reddit users express interest in content by joining topic-based communities, the majority of which are focused on nonpartisan topics such as entertainment, hobbies and sports. Finally, while both Reddit and Facebook closely track which content is most popular, they differ in how they use that data. Reddit prioritizes content that users vote to be the most interesting or informative, while Facebook gives priority to whatever has garnered the most engagement, which can span from positive affirmation to angry disagreement. As a result, the most intensely passionate, most partisan Facebook users can drown out moderate voices.
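To make that last difference concrete, consider a minimal, hypothetical sketch in Python of the two ranking rules described above. Nothing here is either platform's actual code; the Post class, its field names and both scoring formulas are illustrative assumptions only.

# A hypothetical sketch of the two ranking philosophies described above.
# Nothing here is either platform's actual code; the Post class, its
# fields and both scoring rules are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    upvotes: int
    downvotes: int
    likes: int
    comments: int
    shares: int
    angry_reactions: int

def vote_based_score(post: Post) -> int:
    # Reddit-style: net community approval, so disagreement lowers rank.
    return post.upvotes - post.downvotes

def engagement_based_score(post: Post) -> int:
    # Facebook-style: every interaction counts, including angry ones,
    # so divisive content can outrank broadly liked content.
    return post.likes + post.comments + post.shares + post.angry_reactions

posts = [
    Post("Broadly liked explainer", 900, 50, 900, 40, 30, 5),
    Post("Divisive partisan take", 400, 600, 400, 700, 250, 500),
]

# The same two posts, ranked two ways:
print(max(posts, key=vote_based_score).title)        # Broadly liked explainer
print(max(posts, key=engagement_based_score).title)  # Divisive partisan take

Under the vote-based rule, the divisive post's downvotes sink it; under the engagement-based rule, those same objections, registered as comments and angry reactions, lift it to the top.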

The impacts of using a platform are not random; they are a direct outcome of that platform’s design decisions. Social media companies are constantly analyzing and fine-tuning their algorithms and adding new features. The features that facilitate a right-wing echo chamber on Facebook — such as how users connect with one another and how the algorithms work to maximize engagement — are intentional choices.

Other platforms have chosen differently. Twitter chief executive Jack Dorsey has endorsed transparency in how Twitter’s algorithms work and is considering enabling users to choose or even create “their own algorithms to rank the content” they see. Because the algorithms that decide what users are shown shape their news consumption so powerfully, this level of transparency could be an important step in promoting public trust in social media.

Facebook, Reddit and Twitter all have policies about extreme content on their platforms. Yet Facebook’s algorithms push users to become more extreme. As long as increased use of Facebook is associated with polarizing echo chambers, Mark Zuckerberg is failing in his stated mission to “bring the world closer together.”
