Who’s regulating the autonomous weapons systems that are changing the nature of warfare?

“No one is developing Terminator-style robots,” says Justin Haner, a doctoral candidate at Northeastern who studies autonomous weapons systems. “But [artificial intelligence] is being added incrementally to existing systems so that drones can fly on their own and target things on their own, for example.” Photo by Ruby Wallau/Northeastern University

Countries around the world are pouring billions of dollars into developing autonomous weapons systems—weapons that are equipped with predictive, decision-making abilities, courtesy of artificial intelligence. 

Proponents of autonomous weaponry argue that such systems will keep soldiers out of harm’s way by keeping them off the battlefield, says Justin Haner, a doctoral candidate at Northeastern University who studies autonomous weapons systems. But he thinks the argument is short-sighted: a country with an increasingly robotic fighting force might send those machines into conflicts to which it would not commit human soldiers, Haner says.

Justin Haner is a doctoral candidate at Northeastern who studies autonomous weapons systems. Photo by Ruby Wallau/Northeastern University

Haner works with Denise Garcia, who is an associate professor of political science and international affairs and who sits on the International Committee for Robot Arms Control. Haner and Garcia recently published a paper that examines trends in the development of autonomous weapons by countries around the world.

They found that there are five world leaders in the development of autonomous weaponry—the United States, China, Russia, South Korea, and the European Union—and they’re each pouring billions of dollars into this arms race.  

“Autonomous weapons are devices that can survey their surroundings, identify potential enemy targets, and independently choose to attack those targets on the basis of sophisticated algorithms,” according to the Arms Control Association, a nonpartisan organization that advocates for arms control policies.

“No one is developing Terminator-style robots,” says Haner, “but [artificial intelligence] is being added incrementally to existing systems so that drones can fly on their own and target things on their own, for example.”

In the U.S., a 2012 Department of Defense policy allows semi-autonomous systems to “engage targets” that have been pre-selected by human operators.

In 2018, a Russian state-run news agency announced that Russian forces had deployed a remote-controlled robotic tank to Syria.

Despite all this, Haner and Garcia found that there is little oversight or regulation when it comes to developing autonomous weapons, and even less public discourse about it. 

“These things are happening now, and it’s really crucial that we start having conversations about what to do about it,” Haner says.

Haner is concerned about other possible negative consequences of such technology.

For example, taking human soldiers out of immediate danger might lower the perceived stakes for countries that are considering entry into a foreign conflict, he says.

“Each new foreign conflict entanglement risks unintended escalation and could result in the [country] finding itself forced into a deadly large-scale war that could have been avoided entirely if we had not developed lethal robotic systems which were sent into conflicts under circumstances in which we would not have sent soldiers,” Haner says.

The proliferation of autonomous weaponry could also give “terrorists greater ability to choose their target,” Haner says. “They’re also much less likely to get caught carrying out an attack if they’re using AI-enhanced drones” as opposed to being onsite, he adds. 

Artificial intelligence is also prone to the same biases as the people creating it. If, for example, facial recognition software has been trained mostly on white faces, there’s a higher likelihood that it will produce false positives when scanning the faces of people of color, Haner says.

“It could lead to the over-targeting of minority groups,” he says. 
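
To see how that skew can arise, consider a minimal sketch, assuming a deliberately toy setup rather than any real surveillance or weapons system: if a face-matching model never properly learned the traits that distinguish individuals in an underrepresented group, distinct people end up with similar internal representations, and a simple distance-threshold matcher starts confusing them. Every function, number, and “trait” below is a hypothetical illustration, not a description of Haner and Garcia’s work or of any deployed system.

```python
# Purely illustrative simulation of training-data imbalance (not any
# real system): a matcher confuses DISTINCT people in a group whose
# distinguishing traits the model barely learned.
import random

random.seed(0)

def extract_features(identity, resolution):
    # Toy "learned" feature extractor: each trait survives only as well
    # as the training data let the model learn it (resolution in [0, 1]).
    return [trait * res for trait, res in zip(identity, resolution)]

def is_match(emb_a, emb_b, threshold=0.5):
    # Declare two embeddings the same person if they lie close together.
    dist = sum((a - b) ** 2 for a, b in zip(emb_a, emb_b)) ** 0.5
    return dist < threshold

def false_match_rate(resolution, trials=10_000):
    # Fraction of trials in which two different individuals are
    # wrongly merged into one identity by the matcher.
    errors = 0
    for _ in range(trials):
        person_1 = [random.uniform(-2, 2), random.uniform(-2, 2)]
        person_2 = [random.uniform(-2, 2), random.uniform(-2, 2)]
        if is_match(extract_features(person_1, resolution),
                    extract_features(person_2, resolution)):
            errors += 1
    return errors / trials

# Hypothetical effect of an imbalanced training set: traits that vary
# in the majority group were learned well; a trait that distinguishes
# individuals in the minority group was barely learned at all.
majority_resolution = [1.0, 1.0]  # both traits well resolved
minority_resolution = [1.0, 0.1]  # second trait nearly unlearned

print("false match rate, majority group:",
      round(false_match_rate(majority_resolution), 3))
print("false match rate, minority group:",
      round(false_match_rate(minority_resolution), 3))
```

In this toy setup, the group whose distinguishing trait was poorly learned gets falsely matched several times more often than the well-represented group, which is the over-targeting dynamic Haner warns about.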

Haner and Garcia argue that countries around the world—and particularly the five powers leading the autonomous arms race—“should attempt to set global norms and push for a ban on the use of autonomous weapons systems now,” according to their paper. 

“We have to move quickly,” Haner says. 

For media inquiries, please contact Marirose Sartoretto at m.sartoretto@northeastern.edu or 617-373-5718.