To stop fake news, researchers call for internet platforms to choose quality over quantity

“Fake news” has made headlines and dominated social media chatter since the 2016 presidential election. It appears to be everywhere, and researchers are still determining the scale of the problem, said David Lazer, Distinguished Professor of Political Science and Computer and Information Sciences at Northeastern.

Lazer, along with Matthew Baum of the Harvard Kennedy School, is a lead author of a perspective paper published Thursday in Science, in which he and his colleagues examined fake news and its prevalence and impact across Google, Facebook, and Twitter. They offered recommendations for stemming its flow and influence, and in particular called for more interdisciplinary research—including more collaboration between internet platforms and academia—“to reduce the spread of fake news and to address the underlying pathologies it has revealed.”

“There needs to be some regular auditing of what the platforms are doing and how much this information is spreading, because there is a collective interest in the quality of the information ecosystem that we all live in,” Lazer said.

The authors defined fake news as “fabricated information that mimics news media content in form but not in organizational process or intent.”

One problem Lazer and his colleagues point out is that platforms have been accidentally complicit in amplifying fake news. Take Twitter’s trending mechanism. When the platform notices a surge in tweets about a particular topic—like a celebrity’s birthday or an approaching nor’easter—Twitter may list the topic as trending. But the algorithm that identifies trends can be manipulated. In one recent case, a BBC investigation found that for $200, a company in Saudi Arabia will deploy its army of bots to make any hashtag trend for a few hours.
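The trending mechanism described above is essentially a volume-surge detector, which is what makes it gameable. The sketch below is a hypothetical, simplified illustration—not Twitter’s actual algorithm; the function `detect_trends`, its thresholds, and the sample data are all invented—showing why a purchased burst of bot tweets can be enough to push a hashtag over the line.

```python
# Hypothetical sketch of a volume-surge trend detector (not Twitter's real system).
# A hashtag is flagged as "trending" when its volume in the current window far
# exceeds its recent baseline -- exactly the signal a rented bot network can fake.
from collections import Counter

def detect_trends(current_window, baseline_window, surge_factor=5.0, min_count=500):
    """Return hashtags whose volume surged versus the baseline window.

    current_window / baseline_window: iterables of (account_id, hashtag) pairs.
    """
    current = Counter(tag for _, tag in current_window)
    baseline = Counter(tag for _, tag in baseline_window)
    trending = []
    for tag, count in current.items():
        prior = baseline.get(tag, 1)  # avoid division by zero for brand-new tags
        if count >= min_count and count / prior >= surge_factor:
            trending.append(tag)
    return trending

# A few hundred bot accounts each tweeting the same hashtag once clears both
# thresholds, even though no genuine conversation is happening.
organic = [(f"user{i}", "#weather") for i in range(300)]
bots = [(f"bot{i}", "#paidtag") for i in range(600)]
print(detect_trends(organic + bots, baseline_window=[("user0", "#weather")]))
# -> ['#paidtag']  (the purchased hashtag trends; the organic one does not)
```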

Lazer and his co-authors cited previous studies estimating that between 9 and 15 percent of active Twitter accounts are bots, and that there may be as many as 60 million bots on Facebook. They called upon the platforms to police this kind of activity by identifying false content and ensuring that it doesn’t become amplified. “Generally, the platforms should avoid accidentally amplifying low-quality content when detecting what is trending,” Lazer said. “That seems like a no-brainer.” But historically, the design of social media platforms has prioritized engagement over quality of content. That will need to change, Lazer said, to effectively combat fake news.
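One way to read that recommendation is that trend detection should weigh who is posting, not just how much is posted. The following is a hedged sketch under that assumption; `weighted_trend_volume`, the `bot_scores` input, and the cutoff are illustrative inventions and do not reflect any platform’s real pipeline.

```python
# Hypothetical sketch: discount accounts that an upstream bot classifier has
# flagged before counting their posts toward trend volume.
from collections import defaultdict

def weighted_trend_volume(window, bot_scores, max_bot_score=0.5):
    """Count hashtag volume, weighting each tweet by how human its author looks.

    window:     iterable of (account_id, hashtag) pairs.
    bot_scores: dict mapping account_id -> estimated probability the account is
                automated (assumed to come from some bot-detection model).
    """
    volume = defaultdict(float)
    for account, tag in window:
        score = bot_scores.get(account, 0.0)
        if score >= max_bot_score:
            continue                  # ignore likely bots entirely
        volume[tag] += 1.0 - score    # discount borderline accounts
    return dict(volume)

window = [("user1", "#weather"), ("user2", "#weather"), ("bot1", "#paidtag")]
scores = {"user1": 0.05, "user2": 0.10, "bot1": 0.95}
print(weighted_trend_volume(window, scores))
# -> {'#weather': 1.85}  (the bot-driven hashtag contributes nothing)
```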

In October 2017, executives from Twitter, Facebook, and Google appeared on Capitol Hill to acknowledge their platforms’ role in enabling the spread of misinformation by Russian agents during the 2016 presidential campaign. Since then, Twitter has tightened its policies on automation, suspending thousands of suspected bot accounts last month. And Facebook has ended an experiment with its News Feed that ran in six countries and was shown to be magnifying fake news.

The researchers acknowledged that Google, Facebook, and Twitter have made attempts to counteract fake news, pointing to Twitter’s move to block certain accounts linked to Russian misinformation and Facebook’s plan to shift its algorithm to account for “quality” in its content curation. But the authors said these platforms have not provided enough detail for evaluation by the research community.

The researchers cited a previous study showing that about 47 percent of Americans “often or sometimes” get their news from social media, with Facebook being the dominant source. With this in mind, platforms should take the initiative to integrate ways of evaluating the publishers and sharers of information, Lazer said. Otherwise, there is a chance the government will embrace that role.

“That creates potential for government abuse by tilting the perspectives that can be shown on the internet,” Lazer said. “There are reasons to be worried about government intervention in this space.” His hope instead is that the threat of government regulation will cause the platforms themselves to take action against fake news.