Tired of being manipulated by fake news? This Northeastern researcher is creating the tools we need to fight against it.

by Bill Ibelle, June 18, 2018

Misinformation. Propaganda. Fake news. We are swimming in a sea of information that may or may not be true, yet this information provides the bedrock for our beliefs about politics, policy, education, health, science, and more.

“With the rise of social media, far more misinformation is getting to many more people,” said Briony Swire-Thompson, a postdoctoral researcher at Northeastern whose work focuses on why people believe false information and how corrective information can be presented to make it most effective.

Swire-Thompson was recently named by the prestigious Poynter Institute for Media Studies as one of six academics nationwide who are “on the frontlines of fake news research.” She works in the lab of David Lazer, Distinguished Professor with joint appointments in the Department of Political Science and the College of Computer and Information Science.

Whether we know the formal term or not, we all understand the concept of motivated reasoning: people grant validity to information that supports their opinions while discounting information that contradicts those beliefs. But Swire-Thompson’s early research established that there’s another factor at play, and that factor is memory.

To investigate this, she devised a study based on common beliefs that hold no emotional power for people. The goal was to eliminate motivated reasoning from the study.

“I spent the first several weeks of my research finding these myths and determining whether they are true,” she said. “It was a blast.”

The study included myths such as “eating carrots improves your eyesight,” “you can spot a liar by the way they move their eyes,” and “peeing on a jellyfish sting eliminates the pain.” She had subjects rate their confidence in these statements on a scale from 0 to 10. She then provided corrective information from a reputable source and had the subjects rate the veracity of the statements a second time.

She found that the majority of the subjects modified their opinions based on the corrective information. But when those same subjects were retested one week later, their ratings had moved dramatically back toward their original opinions. This “re-believing” was even stronger after three weeks.

“My conclusion is that corrective information is already at a disadvantage due to the limitations in our memory, and on top of that, we have our ideological biases to deal with,” she said.

The impact of ideological bias was the subject of a 2017 paper she published in Royal Society Open Science, a peer-reviewed scientific journal. This time she tested the impact of corrective information on a subject that involves strong ideological beliefs: misstatements made by Donald Trump during the 2016 presidential campaign.

The experiment involved more than 2,000 subjects. A mixture of Trump supporters and non-supporters were asked to rate their confidence in eight statements that Trump made during the campaign, four of them true and four false.
To further test the Trump phenomenon, half the respondents were told the statements were made by Trump, and half were given the statements with no attribution. As in the previous experiment, the subjects were asked to rate their confidence in each statement twice: upon first reading it, and either (a) immediately after receiving reliable evidence that the statement was true or false, or (b) a week after receiving the corrective information.

The following is a small sample of her findings:

People of all political persuasions were willing to modify their beliefs based on corrective information from reliable sources. That was the good news, according to Swire-Thompson. However, as in her earlier study, subjects “re-believed” the false information when retested a week later.

And this wasn’t limited to Trump supporters. When Democrats encountered true statements attributed to Trump, they assumed they were false. They adjusted this error based on the corrective information, but a week later, they were trending back toward their original belief that Trump’s accurate statements were untrue.

Even when subjects acknowledged that their candidate had made untrue statements, it had virtually no impact on their voting intentions.

When the statements were attributed to Trump, both supporters and non-supporters had more trouble separating fact from fiction a week after the initial test. Among those who received the statements unattributed, the confusion was much less pronounced.

Lessons and Questions

The field of correcting misinformation remains largely uncharted territory. In addition to publishing her own research, Swire-Thompson has written book chapters and articles that distill the available research on how to minimize the impact of misinformation. Here are a few examples:

Provide factual alternatives

People are more likely to modify their beliefs if the corrective information includes an alternative explanation. In one study, subjects who were initially led to believe that a warehouse fire was caused by negligently stored gas canisters were more likely to accept (and remember) that this was incorrect if they were told the police had found evidence of arson.

“Corrective information is much more effective when people can ‘switch out’ explanations,” said Swire-Thompson. “Don’t just say, ‘This is incorrect.’ If you remove something, it’s important to fill the hole.”

Repeat retractions

Since automatic memory equates familiarity with truth, it’s important that the corrective information be repeated as frequently as, and with even greater clarity than, the myth.

Tailor the correction to the audience

When incorrect information supports a person’s values and worldview, it’s essential that the corrective information be presented in an unthreatening manner. Numerous studies have shown that corrections can actually strengthen a person’s belief in misinformation, because the correction challenges their worldview and forces them into a defensive posture. For example, when presenting information about climate change to conservatives, studies have shown that it’s effective to frame the correction as a business opportunity rather than lecturing people about society’s bad behavior.

Activate strategic memory

Dual process theory asserts that there are two types of memory: automatic and strategic. Automatic memory is fast and effortless, but it doesn’t include an accuracy filter; the perceived veracity of automatic memory is based exclusively on familiarity.
If misinformation is repeated by multiple sources, automatic memory perceives it to be true. As a result, when this type of memory is retrieved, there is often no recall of the correction.

“Memory doesn’t work like a whiteboard that can be erased,” said Swire-Thompson. “Invalidated information is not deleted.”

Strategic memory requires more effort to activate, and the subject has to be motivated to put forth that effort. But because automatic memory often does not include corrections, activating strategic memory is essential to combating misinformation.

Train informed consumers

One way to do this is to make healthy skepticism a fundamental element of the school curriculum. Another is to warn people, before they receive corrective information, that they are likely to forget the correction; this helps them mentally tag the bogus information as false.

When it comes to controlling propaganda, misinformation, and fake news, we still have a long way to go, said Swire-Thompson. “I want to keep digging in these areas. If we can understand the circumstances that allow people to update their beliefs, we can help science communicators become more effective.”