ICE facial recognition search of driver’s license photos ‘damaging’ to minorities, says Northeastern University professor

Artificial intelligence software, such as that used by Immigration and Customs Enforcement to scan state driver’s license databases, can be prone to errors and can therefore result in higher rates of misidentification and false arrests, says Northeastern associate professor Kay Mathiesen. (AP Photo/LM Otero, File)

Facial recognition software is coming into increasingly common use as a high-tech security measure. But is it always ethical?

The question is in the news following a report that agents with U.S. Immigration and Customs Enforcement have been surreptitiously using facial recognition technology to scan driver’s license photos for information about immigrants who are in the country illegally.

Kay Mathiesen, an associate professor of philosophy and religion at Northeastern who focuses on information and computer ethics and justice, condemned the practice as “damaging” to all citizens, but especially to minorities, who she worries will be unfairly targeted by law enforcement.

Kay Mathiesen, an associate professor of philosophy and religion at Northeastern who researches computer ethics and justice, has expressed concerns about the way that federal investigators are using facial recognition technology. Photo by Matthew Modoono/Northeastern University

Artificial intelligence can be prone to errors, she says, and can therefore result in higher rates of misidentification and false arrests. Stricter oversight of the software development process could help reduce those inaccuracies, she says.

“There’s the issue of how accurate are these systems, and it looks like there’s good reason to think they’re not very accurate,” she says. “They’re more inaccurate when it comes to women and minorities, so you’re going to get way more false-positive matches. You already have the case of over-policing of minority communities, so this is going to exacerbate that.”

A new global survey conducted by Northeastern University and Gallup revealed that 70 percent of U.S. respondents expressed concern about their personal information being at risk when using products or services that employ artificial intelligence. That’s a good sign, says Mathiesen.

“I think people are starting to be more aware of what the issues are and how this information can be used,” she says. “That’s actually a really hopeful sign that there’s going to be more of a pushback, because we’re way behind the curve when it comes to regulating how data is used, and how algorithms are used.”

The use of facial recognition technology has come under criticism from both Democratic and Republican lawmakers, who have raised concerns about the threat it poses to civil liberties. Neither Congress nor state legislatures have approved the practice, which has also been employed by agents of the Federal Bureau of Investigation.

The facial recognition story, first reported by The Washington Post, revealed that from 2014 to 2017, ICE officials requested and were granted access to Department of Motor Vehicles driver’s license databases in Utah, Vermont, and Washington (all states that allow people in the U.S. without proper residence documents to obtain driver’s licenses) without the knowledge or consent of motorists. Vermont discontinued the searches in 2017 at the behest of Governor Phil Scott.

“The DMV is not an arm of law enforcement or the FBI or ICE,” Mathiesen says. “There’s no reason for people to think that that information is going to be shared with these other law enforcement agencies. The most fundamental notion in any kind of ethics is the idea of informed consent, and that was missing.”

Mathiesen says she’s also concerned about how secure the databases maintained by the DMV are, and wonders if they are susceptible to being hacked by criminals or foreign governments. She supports the implementation of court orders, search and seizure warrants, and other legal safeguards in order to limit the use of facial recognition technology by federal and local law enforcement agencies.

Mathiesen also took ICE to task for its use of facial recognition technology to search driver’s license databases, calling it a privacy violation and a breach of civil liberties.

In a democratic society, she says, citizens should be able to make informed decisions about the kinds of powers they want their law enforcement and government agencies to have, and they can do so only if they know exactly how their information is being used.

“I think what we really need is to have the transparency of exactly what’s happening so we can have a democratic conversation about what we want as a society the balance to be between security and privacy,” she says.

Mathiesen says that her biggest concern is that people will become complacent about unchecked government surveillance, and willingly risk compromising their privacy rights and civil liberties for the sake of national security.

“I think it’s really troubling if we are people in a society where that’s what we accept, that we want to be taken care of by a benevolent, all-powerful, all-knowing force,” she says. “It really seems to undercut the idea of individual autonomy and individual liberty, because if you believe that you’re constantly being watched, it can’t help but change your behavior.”

For media inquiries, please contact media@northeastern.edu.