Will scientists ever be able to simulate the full range of human behavior? And if they could, how could those models help us better understand ourselves?
These questions don’t propel the plot of the newest blockbuster sci-fi film; instead they’ve served as the seed for professor Stacy C. Marsella’s research—and he’s getting closer to answering them.
Tackling such an undertaking requires expertise in both psychology—to understand the complexities of human emotion and how it's displayed—and computer science—to encode all that information in a simulator that functions consistently and reliably.
Marsella is perhaps uniquely suited to take on such a task given the interdisciplinary nature of his appointments at Northeastern—he holds joint appointments in the College of Computer and Information Science and the Department of Psychology.
His research focuses on the computational modeling of emotion, cognition, and social behavior. On Monday in the Raytheon Amphitheater, where he delivered the 53rd annual Robert D. Klein Lecture, he discussed some of his pioneering work in building human behavior simulators and the implications it could have for the future.
The award, given each year to a member of the teaching faculty who has contributed with distinction to his or her field of study, was established in 1964. The lectureship was renamed in 1979 in tribute to the late Robert D. Klein, professor of mathematics, chairman of the Faculty Senate Agenda Committee, and vice chairman of the Faculty Senate.
“There’s a long history in artificial intelligence of looking at the question of, ‘What is the function of emotion in human behavior?’ And then, ‘Can we build more robust artificial intelligence if we understand those applications?’” Marsella told a crowd of students, faculty, and staff.
Some applications of his human behavior simulators have already been realized. Marsella has worked with scientists to build large-scale models of cities in order to simulate pre- and post-natural disaster responses, with the goal of optimizing those responses.
“I look at the human element—how do people respond to things like a coming hurricane? Do they evacuate? And if so, when?” Marsella said.
He’s also built virtual humans that have the ability to respond to their environments and use dialogue to interact with living people—abilities that make them ideal for inferring whether a human on the other end of a conversation may suffer from depression or post-traumatic stress disorder.
These types of virtual humans have also been used in programs that teach new doctors how to break bad news, helping improve the doctors' bedside manner before they interact with real patients.
“There’s a lot of knowledge about human behavior that goes into building something like that,” Marsella said. “For example, how should the virtual human express emotion? There are nonverbal things like body language and facial reactions to take into account as well.”
He's been able to create this simulated behavior by breaking human behavior down to its core emotional motivations. Emotion has a profound impact on human behavior, Marsella said: it shapes the body, as someone prepares to respond to external stimuli; it shapes the mind, by interrupting other behaviors and influencing goals; it shapes others' minds, by signaling anger, for example; and it coordinates all of these behaviors with respect to one another.
In one striking example, Marsella showed a video of a group of actors rehearsing for a play. Unexpectedly, a bird flies through an open window and into the room. One woman in the video transitions from rehearsing, to preparing to attack the bird, to protecting others in the room from the rogue bird, to helping with its humane capture in less than three seconds.
“The AI side of me wonders, ‘How can I build an AI system that’s that robust, that flexible?’” Marsella said.
These sorts of questions highlight for Marsella that “the work I do here is inherently interdisciplinary.”