Northeastern University researchers among the first to gain access to Facebook data that could help us understand how fake news spreads on social media

Northeastern researchers are working to determine how fake news ends up in the news feeds of Facebook users. Photo by Adam Glanzman/Northeastern University

The persistence of fake news and misinformation is creating chaos across the globe. It’s affecting elections in the United States and India, and fueling protests in Sudan.

But how does fake news spread? One common theory posits that fake news spreads from person to person on social media, so that no reputable media organization ever vets the accuracy of the information before it reaches a wide audience.

Now, for the first time, researchers at Northeastern University will be able to test the theory.

Four Northeastern professors are among the first group of researchers who will be given access to closely guarded Facebook data—data that could reveal key information about the way people share fake news on social media.

They’ll have access to three datasets from Facebook. The first dataset will include information from public accounts on Facebook and Instagram (a social media platform owned by Facebook) that will enable the researchers to track the popularity of news items across the two platforms. The second set will include data on political advertisements that ran on Facebook in the U.S., U.K., Brazil, India, Ukraine, Israel, and the EU. And the third set will include information about specific URLs that have been shared by at least 100 unique Facebook users.
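
To make the third dataset concrete, here is a minimal sketch of how its 100-unique-sharer threshold could be reproduced from a raw share log. The column names and numbers are hypothetical, not Facebook's actual schema.

```python
# Sketch only: a hypothetical share log with one row per (user, url) share event.
import pandas as pd

shares = pd.DataFrame({
    "user_id": [1, 2, 3, 1, 2],
    "url": ["a.com/x", "a.com/x", "a.com/x", "b.com/y", "b.com/y"],
})

THRESHOLD = 100  # the article's cutoff: at least 100 unique Facebook users

# Count distinct sharers per URL and keep only URLs meeting the threshold.
unique_sharers = shares.groupby("url")["user_id"].nunique()
eligible_urls = unique_sharers[unique_sharers >= THRESHOLD].index.tolist()

# With these toy numbers no URL qualifies; on the real dataset this list
# would contain every URL released to the researchers.
print(eligible_urls)
```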

The researchers will use this Facebook data to build a map that traces fake news posts back to their origins.
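
As a rough illustration of what such a map could look like (a sketch under assumed data, not the team's actual method), reshares can be modeled as a directed graph in which each edge points from a share back to the post it was shared from; tracing a post to its origin then means walking that chain until a post with no parent is reached.

```python
# Sketch only: a toy reshare cascade, with made-up node names.
import networkx as nx

cascade = nx.DiGraph()
cascade.add_edges_from([
    ("user_share_2", "user_share_1"),    # user 2 reshared from user 1
    ("user_share_1", "publisher_post"),  # user 1 shared the publisher's post
    ("user_share_3", "user_share_1"),
])

def trace_origin(graph, node):
    """Follow 'reshared from' edges until reaching a post with no parent."""
    while True:
        parents = list(graph.successors(node))
        if not parents:
            return node
        node = parents[0]

print(trace_origin(cascade, "user_share_3"))  # -> "publisher_post"
```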

“It’s very exciting,” says Nick Beauchamp, an assistant professor of political science at Northeastern, who’s leading the project for the university. “Facebook is the 800-pound gorilla, and the opportunity to work with their data in a way that’s ethical and secure is an exciting one.”

Beauchamp is working with an interdisciplinary group of researchers in the fields of political science, economics, and computer science—including Northeastern faculty members David Lazer, Donghee Jo, and Lu Wang, as well as Kenneth Joseph, an assistant professor at State University of New York at Buffalo.

They’re trying to figure out how fake news ends up in the news feeds of Facebook users, because, they say, the answer will provide important insight into the fake news phenomenon.

There are generally two ways it happens, Beauchamp says: Either media companies publish fake news stories on their own Facebook accounts, where the stories appear directly in users’ news feeds, or users share fake news stories with their online social circles.

The researchers might discover that fake, misleading, or ideologically extreme news from established media companies is being pushed onto people’s social media feeds—either by media companies themselves or via Facebook’s algorithm—which would suggest that these companies are at the root of the problem, Beauchamp says.

Or they could find that fake news spreads when friends share it on social media, which would provide compelling evidence that reputable media organizations are no longer at the helm of what constitutes news, Beauchamp says.

“We know there’s a problem with fake news,” Beauchamp says. “What we don’t know is whether it’s a problem of the institutions and the moment in which we’re living, or if it’s a problem that evolved from peer-sharing.”

In 2018, Facebook made a major change to its news feed, shifting the focus to prioritize posts from friends rather than media companies. Users now see more of their friends’ posts, and fewer posts from news publishers and businesses.

This means that Beauchamp and his colleagues can compare how much fake news was being shared before and after Facebook prioritized posts by friends. If fake news spread just as widely after the change, it would suggest that people, not algorithms, are driving the glut of misinformation.
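
A simple version of that before-and-after comparison might look like the following sketch, which uses made-up daily counts and an assumed cutoff date standing in for the 2018 news feed change.

```python
# Sketch only: hypothetical daily counts of shares of flagged fake-news URLs.
import pandas as pd

daily = pd.DataFrame({
    "date": pd.to_datetime(["2017-12-01", "2017-12-02", "2018-02-01", "2018-02-02"]),
    "fake_shares": [120, 135, 118, 130],
})

# Placeholder cutoff for when Facebook began prioritizing posts from friends.
cutoff = pd.Timestamp("2018-01-11")

before = daily.loc[daily["date"] < cutoff, "fake_shares"].mean()
after = daily.loc[daily["date"] >= cutoff, "fake_shares"].mean()

# Roughly equal means before and after would point toward peer sharing,
# rather than the feed algorithm, as the main driver.
print(f"mean daily fake-news shares: before={before:.1f}, after={after:.1f}")
```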

Their research is funded by a grant program from the Social Science Research Council and Social Science One that’s designed to help scholars study Facebook’s impact on democracy worldwide.

Facebook has come under fierce scrutiny recently for allowing the political consulting firm Cambridge Analytica to access the private data of millions of Facebook users, and for failing to block hundreds of fake accounts from running ads meant to influence the 2016 and 2018 U.S. elections.

Beauchamp says that Facebook’s willingness to share some of its data with researchers could help quell the spread of fake news.

“From my point of view, the content on Facebook tends to be rich and robust, and speaks to the actual demographics of the U.S.,” Beauchamp says. “It’s always been something that’s interested us; we just haven’t had access to it until now.”

For media inquiries, please contact Mike Woeste at m.woeste@northeastern.edu or 617-373-5718.