The fake news phenomenon: How it spreads, and how to fight it

This past weekend, Northeastern and Harvard convened researchers and experts across disciplines to examine the fake news phenomenon and find solutions. Photos via iStock.

This past weekend, Northeastern University and Harvard University co-hosted a conference on fake news and how to combat it. Experts ranging from network scientists and psychologists to social media specialists, journalists, and political scientists convened at Harvard Law School to bring their knowledge to bear and determine what can be done to stem the pernicious flow of misinformation.

“A well-functioning democracy requires a healthy ecosystem of truth-tellers,” said David Lazer, Distinguished Professor of Political Science and Computer and Information Science at Northeastern and co-director of the university’s NULab for Texts, Maps, and Networks. “Citizens need to be informed, and we need institutions to communicate what’s happening in the world. In a democracy, we have to respect these enduring differences in a body politic, but the fact that there’s legitimate diversity doesn’t mean that all presentations of reality are legitimate.”

Along with Harvard professor Matthew Baum, Lazer co-organized the conference, which he said was aimed at catalyzing research and action on this issue. Here are some of the highlights from the conference, which featured several panel discussions.

What is fake news?

Researchers acknowledged the difficulty in defining what, exactly, qualifies as fake news. Lazer defined it as “a subgenre of misinformation,” calling it “information regarding the state of the world that’s constructed with disregard of the facts and invokes the symbols of existing truth-tellers. It misinforms by appealing to the very worst of human nature, and undermines truth-tellers at the same time.”

Harvard Law School researcher Yochai Benkler argued that there is an urgent need to distinguish fake news from propaganda.

“A guy sitting in his kitchen making (stuff) up so he can get $5,000 in ad money from Facebook is fake news,” Benkler said, while a person purposely distributing misinformation for political gain is propaganda.

“What happens to democracy in the face of intentional efforts to deny the validity of basic methods of defining the range of reasonable disagreement?” he asked. “That’s the big deal, not the guy sitting in the kitchen.”

The psychology of it: How do people come to believe fake news?

Familiarity

One major factor in the way people determine whether news is accurate is whether they’ve come across it before, several of the researchers posited.

“The more people are exposed to fake news, the less they’re able to discern that it’s fake,” said Gordon Pennycook, Banting Postdoctoral Fellow at Yale University.

In one study, Pennycook showed participants a mix of real and fake news stories, then asked them to rate their accuracy. He found that familiar fake news stories were rated as more accurate than unfamiliar real news stories.

“Familiarity is a heuristic,” he said, referring to the cognitive shortcuts everyone takes in making decisions. “If something is familiar, it’s easier to process, and if it’s easier to process, it feels right.”

Northeastern network scientist David Lazer speaks at the podium during this weekend’s fake news conference, co-hosted by Northeastern and Harvard. Lazer was a co-organizer of the conference. Photo by Molly Callahan for Northeastern University

One key to stemming the spread of fake news is not to repeat falsehoods even when correcting them, said Adam Berinsky, professor of political science at the Massachusetts Institute of Technology.

“People just remember hearing things; they don’t always remember whether those things were true or false,” Berinsky said. Finding a way to fact-check information without repeating the false part of it is important. One effective approach Berinsky has found is to enlist sources with a stake in the misinformation to counter it. For example, he found that misinformation that skewed favorably toward the right was more effectively rebutted by a right-wing politician than by a nonpartisan organization.

Social community

The views held in one’s own social community also have a profound effect on the way individuals evaluate news.

“I’ve been studying how people think for a long time,” said Steven Sloman, professor of cognitive, linguistic, and psychological sciences at Brown University, “and the main conclusion I’ve found is that people don’t think very much. Mostly what they do is channel the knowledge of those around them.”

Further, people are less likely to question information coming from a source they trust within their community, according to Miriam Metzger, professor of communication and information technologies at the University of California, Santa Barbara.

This includes news sources with which they’re familiar as well as individual people. For example, a person is more likely to trust an article shared online by someone they trust, whether or not that article actually contains factual information. This, Sloman said, would indicate that the problem is best addressed by changing our culture and our social norms rather than changing individuals.

Partisan bias

The consumption of fake news tends to split along party lines, and our decision to believe it or not also depends on whether it reinforces what we already believe, researchers found.

Metzger said people are generally prone to confirmation bias; in other words, “if you’re seeing biased information, you might think it’s plausible because it confirms what you already think. This makes it difficult to recognize biased information,” she said.

Combating all this? “These are the challenges before us, and they’re formidable challenges,” Metzger said.

Can we un-ring the bell?

In short, according to Boston College researcher Emily Thorson, no.

Thorson, an assistant professor of political science, conducted a study in which participants read an article about a hypothetical political candidate that contained negative but blatantly incorrect information, which was corrected immediately. Participants largely accepted the correction and viewed the misinformation as false.

However, when participants were later asked their opinions on that candidate, they generally rated him more negatively than participants who were never given the false information.

“When we process a piece of information, especially when it’s vivid or emotionally stimulating, it has an instantaneous effect on our attitudes,” Thorson said. “Cognitively we know it’s false, but that’s not enough to erase the emotional, subconscious effect. Just being exposed to misinformation is potentially enough to shape people’s views.”

The imperative here, echoing conclusions drawn by Pennycook and Berinsky, is for the mass media not to broadcast misinformation even in the context of correcting it.

How fake news spreads

Historical context

The dissemination of misinformation for political and social gain—even in broadly popular media organizations—is not new.

Columbia University journalism professor Michael Schudson and Kelly Greenhill, a researcher with Tufts University and the Harvard Belfer Center, both cited numerous instances of such purposeful dissemination throughout American history.

“Such historical examples should make clear that use of fake news to influence political systems is longstanding,” Greenhill said. “The technology has changed, but the tactics have not. Is this disturbing? Is this worse than it should be? Yes, but perhaps it can give us hope that we can emerge from this like we have in the past.”

Why have people believed fake news again and again?

Schudson posited that fear, anxiety, and resentment are powerful emotions that make people susceptible to fake news. Another important factor: “Endorsement by people we normally believe to be credible authorities makes fake news more believable,” he added.

Small concentration of sharers

One such person is the president.

Filippo Menczer, professor of informatics and computer science at Indiana University, has tracked the role of social media in disseminating information, particularly how it is spread inorganically by automated bot accounts.

“In one day, Donald Trump got hundreds and hundreds of mentions on Twitter (from bot accounts) with a link to a fake news site about 3 million illegal voters casting votes in the election,” Menczer said. “Within a few days, he stated that as fact.”

The impact of a few people on social media can be profound.

In one of his studies involving 22,000 Twitter users, Lazer found that 70 percent of the fake news circulating in the months leading up to the election was spread by a mere 15 people.

Duncan Watts, a principal researcher at Microsoft, called the outsized influence of spreaders with broad social reach “the Justin Bieber effect.”

“These are people who can take media that would otherwise die out and give it some extra life,” he said. His research has found that while engineering truly viral content is extremely difficult, creating popular content that is broadcast to a wide audience is not.

He added: “Mass media and celebrities are still the key to dissemination; engineering viral content is still hard.”

What now?

Changing the social media landscape

Eli Pariser, chief executive and co-founder of Upworthy, said determining the next steps will depend on the specific problem we’re trying to solve: a solution that creates more engaged citizens will be different from one that creates more informed citizens.

This is because, as Lazer’s research shows, social media users spreading fake news are already extremely engaged. Whether they’re informed or not is a different matter.

“From my view,” Pariser said, “reducing misperceptions is important, but it’s less important than the question of: ‘Do people actually have the information they need to participate in a good democracy?’”

Adam Sharp, the former head of news, government, and elections for Twitter, added that perhaps the answer would require a drastic shift in how social media platforms operate.

“A business model built on driving user growth or user engagement probably isn’t the one best suited to distribute the most factual information,” he said.