Misinformation about science skews decisions, weakens trust and harms health, according to a National Academies report.
Misinformation about science harms personal decisions, democracy and public policy, says Northeastern University professor David Lazer, who contributed to a National Academies of Sciences, Engineering, and Medicine report released Thursday.
Lazer, a member of NASEM’s Committee on Understanding and Addressing Misinformation About Science, helped evaluate how misinformation spreads and conceptualized ways to limit its harm.
“Misinformation undermines choice, individual agency and democracy,” he says.
The study, funded by the National Science Foundation, defines misinformation and disinformation about science, highlights their impacts, proposes directions for future research and suggests interventions. Over more than two years, the committee examined serious cases of misinformation, especially in medicine.
“For example, the opioid crisis was driven by false claims that opioids like OxyContin weren’t addictive,” says Lazer, a distinguished professor of political science and computer sciences at Northeastern.
Purdue Pharma and other companies misled the public and undermined informed decision-making.
Lazer notes that science helps people understand the world, and misinformation skews choices, often against their best interests. It also affects policy.
“If leaders misunderstand science, policies may fail to align with public needs,” he says.
Social media plays a major role, but Lazer says misinformation from trusted sources such as news outlets can have an even greater impact: when an established outlet misreports scientific findings, the error can be more damaging than false claims circulating on social media.
The report calls for targeted action, especially where misinformation poses serious risks to health and well-being.
“If people misuse or avoid medications due to misunderstanding, the consequences can be severe,” Lazer says.
Community trust is key to fighting misinformation.
“If people distrust science, they distrust scientific institutions, creating fertile ground for falsehoods,” Lazer says.
The report encourages better communication between scientific institutions and communities that are skeptical, marginalized or culturally diverse.
Counteracting misinformation requires systemic efforts, Lazer says, including tools to help people recognize false information. He highlights the challenge of addressing misinformation in complex systems, where interventions in one area can push problems elsewhere.
For example, during the 2020 election, Facebook penalized groups spreading false claims, but misinformation continued to spread through individual resharing.
“Groups faced accountability, but individuals did not,” Lazer says.
The report calls on multiple stakeholders — scientists, universities, civil society organizations, funders of scientific research, journalists and news media organizations, and social media platforms — to act collectively to address misinformation and to increase the supply of, visibility of, and access to credible science information.