Democrats and Republicans consume very different news content on Facebook, researcher says

Photo by Matthew Modoono/Northeastern University

A Northeastern University professor examining political news consumption on Facebook during the 2020 election season says liberals and conservatives get very different news, and much more exclusively conservative content exists than content tailored to liberals.

Perhaps most striking in the research, 97% of the items labeled as misinformation came from the political right.

“This tells us that news consumption on Facebook is highly segregated, meaning that Democrats and Republicans consume very different content and that this is substantially driven by pages and groups rather than user-to-user contact,” says David Lazer, a University Distinguished Professor of Political Science and Computer Sciences at Northeastern University. “It also shows that of the content labeled misinformation, far more of it had conservative audiences rather than liberal audiences.”


Lazer was a co-lead on a paper that analyzed exposure to political news during the 2020 election, using aggregated data for 208 million adult Facebook users in the U.S. The research used Facebook’s classifier to predict the user’s political ideology as well as the platform’s classifier to identify political or civic content. 

The study is one of three papers appearing in this month’s issue of the journal Science, along with a fourth paper in the journal Nature, all recently completed using data from Meta, Facebook’s parent company. Roughly a dozen more papers analyzing the data are in progress.

The research focused on political or civic content posted between Sept. 1, 2020, and Feb. 1, 2021, that contained a link to further political news and that was shared more than 100 times on the platform. The analysis excluded link-free posts, such as standalone statements, images and videos about politics.

Researchers also distinguished between posts shared by friends and posts appearing on pages or in groups. This enabled them to measure the content liberals and conservatives could potentially see (you don’t see every single post of every single friend in your feed, Lazer notes), what content liberals and conservatives actually did see from their friends, and what content liberals and conservatives interacted with through reactions, comments and shares.

Using a scale of 0 to 100 to evaluate how isolated liberals and conservatives are from each other in their news consumption (a score of 0 meant that both liberals and conservatives saw the same news, while a score of 100 meant they received completely different news), Lazer and his colleagues made several key findings.

First, they found that liberals and conservatives are very segregated in terms of their sources of news.

Second, that segregation is “asymmetrical,” meaning that, as Lazer says, “there are a lot more neighborhoods that are exclusively Republican or conservative than are Democratic or liberal.”

Third, news from pages and groups on Facebook is much more segregated by political ideology than friend-generated content.

“Upon reflection, it makes a lot of sense,” Lazer says. “Friends you may disagree with politically, but they are still your friends. But with pages there’s no such issue. Pages align with preferences, and people will choose pages that are mostly liberal or more conservative, and the same is true with groups.”

And finally, the headline-generating finding: of all the links judged to be misinformation by Facebook’s third-party fact-checking program, 97% reached a predominantly conservative audience.

Lazer says the findings may seem unsurprising, given all the talk of self-selecting and algorithmic echo chambers. However, much of the prior research had suggested that online news consumption was still fairly diverse and no more segregated than an offline news diet.

“The narrative challenge in matters of news is that a lot of reporters may think we knew this, but we didn’t really know this,” Lazer says. “In fact, much of the existing research pointed in the opposite direction.”

Lazer explains the disparate findings by noting that prior studies mostly analyzed browsing behavior rather than social media behavior; in other words, they examined what happens when you directly visit a website rather than what happens when you use Facebook.

As for the future, Lazer notes that 2024 is close at hand.

“I’m game,” Lazer says. “Meta deserves credit for opening their doors to external research. I hope they and other companies do this in 2024.”

Cyrus Moulton is a Northeastern Global News reporter. Follow him on Twitter @MoultonCyrus.