A Northeastern professor, two co-ops and a robot go into West Village…

Computer science professor Zhi Tan studies human-robot interaction. A busy building on the Boston campus serves as a training ground for his lab’s helper robots — and the students writing their code.

Zhi Tan following a robot named Marlo.
Computer science professor Zhi Tan follows a robot nicknamed ‘Marlo’ to the restrooms of West Village H on Northeastern’s Boston campus. Photo by Alyssa Stone/Northeastern University

Marlo is new to Northeastern University’s Boston campus, and he’s slowly learning his way around. Very, very slowly.

It’s taken a few outings, but he has the lay of the land on the ground floor of the West Village H residential building, which has classrooms and study spaces for the Khoury College of Computer Sciences. He knows where the bathrooms and the exits are; he’ll lead you to them if asked. He knows to steer clear of obstacles like trash cans and the large staircase in the building’s main corridor.

But for now, he sticks to the first floor. “We haven’t figured out elevators yet,” says Zhi Tan, an assistant professor of computer science. “We’re running into a bit of a problem where the elevator [goes] too quickly.”

“Marlo” is a robot. Specifically, he’s a Stretch 3 from Hello Robot, a consumer model that comes with a heavy wheeled base, a camera sensor and a gripper arm, and is equipped in the lab with artificial intelligence. Tan studies human-robot interaction at Khoury; the West Village space serves as a training ground for Marlo — not an official name, but a nickname bestowed by the students in Tan’s lab — as he learns to become a “helper robot” in the building.

It takes a while to get going. “I have some trouble,” Marlo says when Tan makes the morning’s initial attempts to show off the robot’s capabilities. After a few gentle prompts, Marlo responds, “I will take you!” and slowly leads a small group of onlookers to a bathroom around the corner.

A few months ago, Tan’s team put up a whiteboard for students to suggest things they’d like a helper robot to do. The answers ranged from menial tasks like filling up water bottles and picking up trash to high-level guidance, like giving information on available study spaces in the building.

Marlo can’t do any of that yet. Before he can, the robot needs a lot of practice navigating the real-world environment in which he’ll be asked to perform.

“We’re trying to understand how robots actually interact with people in various settings, whether it’s in factories or in the home,” Tan says. “Those are questions we cannot answer in the lab.”

The West Village project, then, represents an experiential learning opportunity for both man and machine. Marlo, which has been programmed with a large language model and movement recognition capabilities, is gathering more and more data on the humans who roam the halls — and how to best respond to them — with every trip taken. The computer science students working in the lab are getting firsthand experience writing and adjusting Marlo’s code in response to that data.

“Something Zhi says about robotics is, you can do 90% of it in the lab,” says Althea Masetti Zanini, a third-year computer science major and co-op in Tan’s lab working on the project. “The last 10% is the world. You have to take it outside and just try.”

Tan’s overall research is largely geared toward helping people in caregiving capacities: engineering robots that can give an intuitive hand to hospital workers, help the visually impaired get around or enable the elderly to age in place in their homes.  

In the spring, his lab will start running experiments in a “smart apartment” on campus, testing and developing robots to help around the house. “We’re thinking about how robots can give assistance in a socially appropriate manner,” he says. “If a mug is on the edge of a table, maybe we can have the robot move it around,” for example.

But human beings are unpredictable, and no preemptive programming can anticipate every possible scenario a robot might encounter — even in a contained environment like a home or campus building. Roombas, the most ubiquitous examples of smart consumer robots, learn more about a floor plan with each vacuuming trip around a home. Robots operating out in the world, like food delivery robots and driverless cars, have learned a lot through failure. When food delivery robots first launched, there were a handful of reported incidents of them plowing, oblivious, through active crime scenes.

Portrait of Zhi Tan.
Zhi Tan, Northeastern assistant professor in the Khoury College of Computer Sciences, studies human–robot interaction. Photo by Alyssa Stone/Northeastern University

Tan completed his Ph.D. at Carnegie Mellon University in Pittsburgh, a city that hosted an early pilot program for driverless cars. The vehicles used computer vision models to perceive their surroundings and were programmed to avoid road obstructions like large debris and animals. In an incident that has become legend in academic human-computer interaction circles, the whole fleet was paralyzed one weekend because a furry convention was in town — their programming hadn’t prepared them for an influx of humans dressed as animals. It was a scenario no researcher in a lab could have dreamed up.

“That was an interesting data point, right?” Tan says. “If the robots weren’t running that day, they would not have figured this out. These weird things happen.”

The team, including Masetti Zanini and third-year computer science and physics major Emily Taylor, has already learned a lot from trial and error during Marlo’s first few trips. The robot was programmed to give directions to various points around West Village H, for example, but the directions he delivers at present are highly technical.

“It will say, like, ‘turn to three o’clock and walk 250 feet,’” Tan laughs. “We need to do something about that.”
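The article doesn’t say how the team plans to fix this, but one plausible approach is a small post-processing step that rewrites the robot’s raw heading-and-distance output into everyday phrasing. The sketch below is purely illustrative: the clock-to-turn mapping, thresholds and function name are assumptions, not Tan’s lab’s actual solution.

```python
# Hypothetical sketch of rewriting Marlo-style "clock and distance" directions
# into friendlier phrasing. The mapping and thresholds are illustrative
# assumptions, not the lab's actual fix.
def friendly_direction(clock_hour: int, distance_feet: float) -> str:
    turns = {12: "go straight", 3: "turn right", 6: "turn around", 9: "turn left"}
    turn = turns.get(clock_hour, f"turn toward {clock_hour} o'clock")
    if distance_feet < 50:
        distance = "a few steps"
    else:
        # Round to a coarse figure so the robot doesn't sound like an odometer.
        distance = f"roughly {int(round(distance_feet / 50) * 50)} feet"
    return f"{turn.capitalize()} and walk {distance}."

print(friendly_direction(3, 250))  # Turn right and walk roughly 250 feet.
```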

Early on, the robot also had difficulty hearing people over the ambient noise of the busy building. “I have some trouble,” he says when he can’t “hear” or understand directions. So recently, Taylor programmed Marlo with the ability to sense the base level of noise in the environment, then recognize and respond only to voice requests that come in above that baseline.
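The article doesn’t spell out how that works under the hood, but the general idea (sample the room for a moment to estimate its baseline loudness, then treat only audio that is clearly louder than that baseline as a spoken request) fits in a few lines of Python. The frame sizes, threshold and function names below are assumptions for illustration, not the lab’s code.

```python
# Hypothetical sketch of ambient-noise calibration for voice detection.
# Frame sizes, the 2x threshold and function names are illustrative
# assumptions, not the code running on Marlo.
import numpy as np

def ambient_rms(frames: list[np.ndarray]) -> float:
    """Estimate baseline loudness from a few seconds of background audio frames."""
    return float(np.mean([np.sqrt(np.mean(f ** 2)) for f in frames]))

def is_speech(frame: np.ndarray, baseline: float, ratio: float = 2.0) -> bool:
    """Treat a frame as speech only if it is clearly louder than the room."""
    return float(np.sqrt(np.mean(frame ** 2))) > ratio * baseline

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated 16 kHz audio: quiet background frames, then a louder "request".
    background = [0.01 * rng.standard_normal(16000) for _ in range(5)]
    request = 0.1 * rng.standard_normal(16000)

    baseline = ambient_rms(background)
    print(f"baseline RMS: {baseline:.4f}")
    print("background frame is speech?", is_speech(background[0], baseline))
    print("loud frame is speech?", is_speech(request, baseline))
```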

“It’s made talking to it much easier,” Masetti Zanini says.

The two co-ops came into the West Village project with very different computer science backgrounds, and they’ve been learning a lot along with Marlo.

“I had no experience with robotics at all,” Taylor says. “I just knew I found it interesting.”

Over the course of the co-op, she’s gotten a crash course in mechanical engineering, learning how to debug physical firmware. Much of Marlo’s success depends on audio processing, so that has also made up a lot of her work in the lab.

Masetti Zanini, by contrast, was in their high school’s robotics club and knows their way around the “hands-on, building stuff” side of the project. Writing responsive code, however, is new.

“This has definitely been my first proper [computer science] job, where I’m applying things I’ve learned in class,” they say.

Masetti Zanini points out that the upper-level curriculum of Northeastern’s CS program challenges students with problems that mimic real life, and that helping Marlo over the course of the semester has been an ideal real-world challenge.  

“I’ve learned a lot about the issues with implementation that don’t come when you have things filtered for you.”

None of what Marlo can do at the moment — or will do in the near term — is at all necessary in West Village H. The bathrooms and the exits are easy enough to find; students can fill their own water bottles. But it’s useful to think of this low-stakes training as “Helper Robot 101”: a prerequisite for assisting in more vital tasks.

In the future, for instance, Tan hopes his robots will be able to distinguish between people with different needs and disabilities, adjusting their behavior accordingly to make the building — and perhaps the campus — more accessible to them. He’s also interested in how robots might be able to work together, so the fleet of West Village robots will eventually expand. Their progress, Tan says, will accelerate with time.