18.
That was the number of school shootings in the first six weeks of 2018, according to dozens of leading news outlets after a Feb. 14 massacre at a South Florida high school left 17 dead.
The number produced a collective gasp across the country. It was shocking, outrageous, unacceptable—and as it turned out, untrue.
The error does nothing to diminish the magnitude of the tragedy in Parkland, Florida. But it does raise questions about how so many top news organizations could run with a number that, at first glance, strained credibility, and under scrutiny proved to be a significant exaggeration.
Matt Carroll, a Pulitzer Prize-winning data journalist and professor of the practice at Northeastern, is in a unique position to understand how this could happen and how similar data errors can be prevented. His diagnosis is that it was the product of a combination of challenges facing modern journalism—speed, pressure, competition, and the deceptive precision of numbers.
“It’s the seduction of numbers—especially specific numbers,” says Carroll, who has made his career as a numbers specialist. “If an article or tweet says ‘about 20’ it’s going to be received as less of a hard fact than when you have an exact number like 18.”
How the statistic took hold
Less than two hours after the shooting at Marjory Stoneman Douglas High School, a nonprofit organization named Everytown for Gun Safety tweeted that it was the 18th school shooting in the U.S. in 2018. The number was retweeted by several prominent politicians, reporters, and celebrities, then spread like wildfire, producing hundreds of thousands of likes in less than 24 hours.
The mainstream media quickly ran with the stunning “fact,” which was published by MSNBC, CNN, ABC, NBC, CBS, Time, the BBC, the New York Daily News, the Huffington Post, and more.
It wasn’t until the next day that The Washington Post published a story saying the number was highly misleading.
As the Post pointed out, most of those 18 incidents don’t qualify as school shootings based on the common understanding of that term. One involved a man who committed suicide in the parking lot of a school that had been closed for seven months. Another involved a 32-year-old man who was shot in the parking lot during a basketball game, while a third occurred during an after-hours robbery in a school parking lot.
Generally, the phrase “school shooting” is understood to mean that someone was shot. Yet only eight of the 18 incidents involved injuries—and one of those was a suicide in the boys’ room.
How did data that was so misleading end up being used by so many reputable news organizations?
Carroll, who won a 2003 Pulitzer Prize as the data specialist on The Boston Globe’s Spotlight team that broke the Catholic Church sex scandal, says that it probably happened like this: In a rush to create a broader context to the Florida shooting, a reporter decides to use the number. An editor then sees the eye-popping number and decides to hype it by moving it higher in the story. Once the first reputable media outlet goes live with the number, others assume it must be accurate and decide to run with it as well.
“You have to remember,” says Carroll, “that things are happening extremely fast, and this is a high pressure environment.”
Data reliability in a fast-paced world
That pressure-cooker environment isn’t going away, says Carroll, so journalists need to find ways to establish data reliability on the fly. Based on his 26 years as a data specialist with the Globe, here are a few of his suggestions:
1) Be skeptical
“If the number seems too good to be true, it’s often because it isn’t,” he says. “You’ve got to ask yourself, ‘If there have been 18 school shootings in fewer than two months, why haven’t I heard of them?’”
2) Evaluate the source
“Are the numbers coming from a government agency, a source that’s known for being reliable, or from an organization that has a clear agenda and may have a reason to cook the books?”
In this case, the shocking numbers came from Everytown for Gun Safety, a nonprofit formed in 2014 to promote gun control and reduce gun violence.
3) Examine the data
“You’ve got to look, not at the number, but at the data behind the number,” says Carroll. “If you put sausage into the data machine, it’s going to be sausage that comes out.”
Has the organization published its data? If not, that’s a warning sign. If it has been published, review how the group came up with its numbers. Do its conclusions match the information, or do they distort or exaggerate that information in some way? If the data is based on a survey, evaluate whether questions or response options are phrased in a way that encourages a certain reply.
“If you can’t examine the data, you can’t publish it as accurate—or if you do, you have to caveat the hell out of it,” says Carroll.
4) Verify the assertions with your own research
At the very least, reporters should have searched their own news files and the internet to see if they could verify these 18 shootings.
Search reliable news accounts to see how many of the 18 school shootings you can find. “If you can only find stories on five school shootings, that should set off alarm bells.”
5) Find other studies
Even under time pressure, reporters can search to see if other sources—academic research, government agencies, private think tanks, or nonprofits—have produced numbers that support the figures in question.
6) Have the courage of your convictions
If you’re uncomfortable with data, you need to be willing to present your doubts to your editor, which in some cases isn’t going to be popular.
“You have to remember, it’s better to hold a story than to get it wrong,” says Carroll.