Is the US ready for another pandemic? Northeastern scientist testifies to the need for greater preparedness

Mauricio Santillana speaks into a microphone. Courtesy photo

Is the U.S. ready for another pandemic? 

It’s a question members of Congress convened last week to tackle. And one of Northeastern’s own machine learning experts, Mauricio Santillana, a professor of physics and network science, was in Washington, D.C., to help shed light on U.S. preparedness, specifically how big data can be leveraged to create better predictive models.

Santillana, who leads Northeastern’s Machine Intelligence Group for the Betterment of Health and the Environment, testified before the Bipartisan Commission on Biodefense to help, among other things, “explore preparedness needs and efforts, new solutions to improve biosurveillance and data modernization,” the commission said.

The commission is tasked with making recommendations to Congress to better fortify the federal government’s biodefense infrastructure. Biodefense covers biological threats to humans, animals and agriculture.

Speaking with Northeastern Global News, Santillana painted a concerning portrait of the government’s ability to respond to so-called biological threats—detailing a lax and “reactive” posture when it comes to planning for, and pre-empting, hazards like the kind posed by COVID-19. 

The problem, he says, stems partly from the U.S.’s antiquated data collection and processing methods, and partly from the influence of politics on government funding. The U.S. electoral cycle means issues such as climate change and pandemic preparedness take a backseat to more visible, “immediate” problems, a phenomenon researchers have tried to explain through a concept known as temporal discounting.

“Because we have a cycle of politics, politicians want to invest money in something they show within their tenure,” Santillana says. 

The end result amounts to a kind of collective amnesia—a system that works best when already faced with a crisis, rather than one that is adequately equipped to deal with future ones. 

“When there is a crisis, we see an allocation of funds in a reactive way,” Santillana says. “But we’re not very good at allocating those same funds during ‘peace times,’ when we should be reflecting on whether we are ready for when something bad actually happens.”

When the pandemic was first declared and infections began spreading across the U.S., the situation came as a shock to the general public, Santillana says. That’s because the government was taken aback, too—ill-prepared for a reality many in the scientific community had already anticipated in their models.    

And to those who were watching carefully—Santillana and his collaborators—the writing was on the wall months before it became headline news.  

“It was very clear to us that [the COVID-19 pandemic] was going to be a very big crisis,” says Santillana, who also served as an expert witness to city officials in Boston and made policy recommendations when the pandemic first began. 

Santillana says the federal government needs to invest in upgrading its pandemic surveillance capabilities, including finding better methods to collect data that can be used to predict outbreaks and disease activity.

The way to do that, he says, is simple. Private tech companies routinely collect and aggregate users’ mobile phone and search data to conduct their business. The government should devise “socially responsible data-sharing” solutions in the form of legal agreements that would facilitate access to relevant anonymized data streams, Santillana says. The data, collected by these companies, would then be available to decision makers, public health officials and academic partners during public health crises.

“We should be leveraging all of the platforms that these big tech companies have built to provide our citizens with better information about potential biological threats,” Santillana says. 

There have been a few earlier attempts to harness the power of big data to track infectious disease outbreaks. One notable project, Google Flu Trends, sought to use Google search data to “nowcast” outbreaks of influenza based on the number of queries related to the respiratory virus. Launched in 2008, a year before the H1N1 pandemic was declared, the method “failed to reflect the degree to which we would see infections,” Santillana says.
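To make the idea concrete, here is a minimal sketch of what query-based nowcasting looks like in practice: regress an official, delayed flu signal on search-query volumes, then estimate the current week from queries alone. The query series, coefficients and choice of a ridge regression are illustrative assumptions for this sketch, not the details of Google Flu Trends’ actual model, which were proprietary.

```python
# Minimal "nowcasting" sketch: regress official flu activity on search-query
# volumes, then estimate the current, not-yet-reported week from queries alone.
# All data here are synthetic and illustrative.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
weeks = 104

# Hypothetical history: weekly volumes for five flu-related queries (columns)
# and a CDC-style ILI rate (% of doctor visits) that is reported ~2 weeks late.
queries = rng.gamma(shape=2.0, scale=1.0, size=(weeks, 5))
ili = 1.5 + queries @ np.array([0.3, 0.1, 0.2, 0.05, 0.15]) + rng.normal(0, 0.2, weeks)

# Fit on the weeks where official data are already available...
model = Ridge(alpha=1.0).fit(queries[:-2], ili[:-2])

# ...then "nowcast" the two most recent weeks, for which only query data exist so far.
print("Estimated current flu activity:", model.predict(queries[-2:]))
```

The fragility described in the quote below shows up in exactly this step: if search behavior shifts, for instance when media coverage drives flu-related queries that no longer track actual illness, a static regression like this one drifts badly.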

“And then, GFT failed—and failed spectacularly—missing at the peak of the 2013 flu season by 140 percent,” David Lazer, university distinguished professor of political science and computer sciences at Northeastern, wrote for Wired. “When Google quietly euthanized the program … it turned the poster child of big data into the poster child of the foibles of big data.”

Santillana, whose background is in climate science, proposed a better method that leveraged weather forecasting tools to model disease spread, one that he says found traction.

“I became interested in doing something more where I could bring all of my quantitative skills into something that could help people make decisions,” Santillana says.

In October, the White House published a sprawling update to the federal government’s “national biodefense strategy” that, among other things, articulates a set of principles that would steer efforts toward greater pandemic preparedness.

The document, it reads, is “a call to action for state, local, tribal and territorial entities, practitioners, physicians, scientists, educators, industry, and the international community to work together to elevate biological preparedness and response.”

Tanner Stening is a Northeastern Global News reporter. Email him at t.stening@northeastern.edu. Follow him on Twitter @tstening90.