AI can’t land a plane on the Hudson River in an emergency or safely drive a car. A team is using drones to try to fix that.

A Northeastern-led team of collegiate researchers will use drones to help the U.S. Navy understand how to incorporate safety into artificial intelligence systems. The university won a $7.5 million research award to expand knowledge of machine learning. Photo by Adam Glanzman/Northeastern University

Despite years of progress with machine learning, the promise of self-driving cars and other autonomous systems remains elusive because of safety concerns.

A Northeastern-led team of universities has been awarded a $7.5 million research grant from the Defense Department to study ways of bolstering artificial intelligence so it can perform in dynamic, ever-changing environments and make correct decisions on the fly. AI typically performs well at controlled, repetitive tasks, but more work is needed to understand how it behaves under unexpected conditions it has not encountered before.

Mario Sznaier, Dennis Picard Trustee Professor of electrical and computer engineering at Northeastern. Courtesy photo

“There have been tremendous advancements in artificial intelligence, yet the level of autonomy is very low; that’s why you still don’t see self-driving cars on the road,” says Mario Sznaier, Dennis Picard Trustee Professor of electrical and computer engineering at Northeastern and the lead researcher on the project.

“I could make a car that learns the environment by bumping into buildings or bumping into people. That’s not acceptable,” Sznaier says. “You need an AI that can guarantee that under no conditions will humans be harmed. That’s the problem we’re trying to solve.”

The Office of Naval Research, headquartered near Washington, D.C., chose researchers from Northeastern, Johns Hopkins University, the University of California, Berkeley, and the University of Michigan in a highly competitive contest to begin exploring machine learning systems that guarantee safety and performance.

“The Navy has a large program on the science of autonomy, and so we’re pushing the boundaries there,” Sznaier says. His team at Northeastern includes three electrical and computer engineering faculty members: professor Octavia Camps, assistant professor Milad Siami, and Eduardo Sontag, University Distinguished Professor of electrical and computer engineering and bioengineering.

Left to right: Eduardo Sontag, University Distinguished Professor of electrical and computer engineering and bioengineering; Octavia Camps, professor of electrical and computer engineering; and Milad Siami, assistant professor of electrical and computer engineering. Photos by Northeastern University

They may test different theories using drones that protect coastlines, but the overarching goal is integrating AI that safely adapts to unseen scenarios.

Sznaier compares the effort to the emergency landing of a commercial airliner on New York’s Hudson River after the plane’s engines lost power when it struck a flock of birds. The captain of that US Airways flight, Chesley “Sully” Sullenberger, safely landed the aircraft in the 2009 incident that came to be known as the “Miracle on the Hudson.”

“Our current AI would not have been able to land the plane because it wasn’t trained to do that and cannot make decisions like that,” Sznaier says.

Researchers are taking a multidisciplinary approach, a key objective of the U.S. military program. The team of eight people includes biologists who study electric fish, such as eels, that use sensors to gauge their immediate surroundings. “Part of what we’re trying to learn is how to manipulate our behavior to improve how we sense the environment,” Sznaier says.

Findings may help the military on long-term missions that involve unpredictable scenarios. “Suppose that you have a drone that gets shot at and sustains some damage,” he adds. “The airborne vehicle will have to reassess on the fly: ‘What can I accomplish with the damage? Can I complete the mission and still return home?’”

The U.S. Navy grant provides guaranteed funding for five years, an important advantage that allows for long-term planning, Sznaier points out. It also aligns with the university’s focus on machine learning.

Northeastern invested $50 million in a new artificial intelligence research institute last year and is leading a new Massachusetts program, AI Jump Start, which connects small business owners in the state with academic experts who show how machine learning can grow their companies. The university recently co-sponsored a conference to examine the legal and ethical ramifications of artificial intelligence.

For media inquiries, please contact media@northeastern.edu.