Virtual avatars want you to tell them about your pain. Is that the future of health care? 

A recent Northeastern study explored using virtual avatars to help conduct patient pain assessments. Photo by Matthew Modoono/Northeastern University

Venkata Leelakrishna Kodipunjula was dealing with a loss. 

Kodipunjula’s family dog, Snoopy, had just died back at his home in India, but he wasn’t able to say goodbye because he was at Northeastern’s Boston campus pursuing his master’s degree in computer science.

He needed someone to talk to, someone he could share his feelings with.

It was around that time that he came across a flier in Snell Library seeking volunteers to take part in a study. The main criterion: participants had to be 18 or older and have experienced some kind of emotional or physical pain in the past two weeks.

Kodipunjula didn’t know much about what the study would entail, but he thought it was worth a shot if it could bring him some emotional relief.

Just a few weeks later, he was sitting in a room in the Interdisciplinary Science and Engineering Complex in front of a monitor answering a few basic questions about his mental and physical state. 

He wasn’t speaking to a human, but instead a virtual avatar known as an embodied conversational agent. Dressed in a white lab coat with a stethoscope resting on her shoulders, the animated female character listened while Kodipunjula spoke about his troubles. 

“The experience made me feel better because I was able to speak about it out loud,” Kodipunjula says. “On my own, I wouldn’t have spoken that much about it, but I had to share the details, and it made me feel better.” 

Timothy Bickmore, a professor of computer sciences at Northeastern. Photo by Ruby Wallau/Northeastern University

Kodipunjula was one of nine individuals who participated in a study led by Northeastern computer sciences professor Timothy Bickmore that was evaluating the emerging technology and how it can be used to conduct patient pain assessments. 

Bickmore has spent much of his career studying how technology can be used to improve medical outcomes for patients. Virtual agents in particular hold great promise, he says, because they can speak with patients one on one, combining verbal communication with hand gestures and other nonverbal behaviors.

“Think of them like ChatGPT, except they have an embodiment,” says Bickmore in describing the technology. “If they’re on the screen, it’s an animated character, and the idea is that we’re simulating face-to-face conversations. We’re working in areas where building trust and rapport with users is important over time.” 

These agents are particularly helpful for communicating with individuals who have low health literacy or computer literacy, Bickmore says.

“We find that having an embodiment makes these systems much more approachable and acceptable to people who aren’t used to typing at a computer screen or typing on their phone with text prompts.” 

For this particular study, the team focused on evaluating how the agents could be used specifically to conduct pain assessments. It’s the first of many studies supported by the National Cancer Institute, Bickmore says; the institute awarded his team funding to develop better methods for collecting data on how cancer patients are doing between chemotherapy and radiation treatments.

“We wanted to start with pain because there’s literature going back 50 years observing when patients tell stories about their pain, they tend to use a lot of hand gestures,” he says. “They’ll point to where the pain occurs. They’ll use hand shapes to describe the pain. The conversational agent can use the hand gestures and reflect back to the patient what it understood.” 

Participants in the study were split into two groups. The first spoke with a female animated agent that gave direct feedback and expressed empathy when participants described their pain. The second spoke with a female animated agent that simply took in their information without offering any feedback.

Participants in the first group reported that the agent displayed both empathy and sympathy. And while the agent’s voice certainly sounded artificial, participants for the most part felt their concerns were heard.

Participants in the second group were less pleased, noting that while the agent took in their information, they would have appreciated it if she had repeated their symptoms back to them or offered any kind of feedback.

The study also backed up previous literature that found that patients use various verbal and nonverbal behaviors to describe their pain. To track their movements, the team collected data using a variety of tools, including microphones, cameras and a Microsoft Kinect. 

“They used hand gestures of all different kinds,” he says. “That’s promising to us because now we can start looking at building tools to recognize those hand gestures to do a better job of assessing individual reports of pain.”
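
Recognizing those gestures automatically is the kind of problem off-the-shelf computer vision tooling can already begin to address. As a rough, hypothetical sketch only, not the team’s actual pipeline, the snippet below uses the open-source MediaPipe library to pull hand landmarks from a webcam feed, the raw signal such a system would need before it could classify pointing or shape-describing gestures:

```python
# Hypothetical sketch: extracting hand landmarks from a webcam feed.
# This is not the study's pipeline; it only shows the kind of raw
# gesture signal a pain-assessment agent could later classify.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=2)
cap = cv2.VideoCapture(0)  # default webcam

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB frames; OpenCV captures BGR.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            # Landmark 8 is the index fingertip -- a crude proxy for
            # "the patient is pointing at a body location."
            tip = hand.landmark[8]
            print(f"index fingertip at x={tip.x:.2f}, y={tip.y:.2f}")

cap.release()
```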

The agents themselves, while advanced, are still in the proof-of-concept phase, Bickmore notes. For this particular study, a researcher in another room listened to the symptoms patients reported and fed that information into the system driving the agent.

“This was a Wizard of Oz system, meaning a research confederate was listening to the participant and then mapping that to what they thought they understood and what the agent should do next,” he says.
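
In a Wizard-of-Oz design, in other words, the “intelligence” is a hidden human operator who maps what the participant says onto one of the agent’s scripted behaviors. A minimal sketch of that control loop, with hypothetical event labels and canned responses standing in for the study’s two conditions, might look like this:

```python
# Minimal Wizard-of-Oz control loop (hypothetical sketch).
# A hidden operator hears the participant and picks the agent's next
# move; the agent itself does no understanding of its own.

EMPATHIC_RESPONSES = {
    "pain_reported": "I'm sorry to hear that. Can you show me where it hurts?",
    "pain_described": "That sounds really difficult. So the pain is sharp?",
}
NEUTRAL_RESPONSES = {
    "pain_reported": "Noted. Please continue.",
    "pain_described": "Noted. Please continue.",
}

def run_session(condition: str) -> None:
    responses = EMPATHIC_RESPONSES if condition == "empathic" else NEUTRAL_RESPONSES
    while True:
        # The operator, not the software, interprets the participant.
        event = input("operator> label what you heard (or 'done'): ").strip()
        if event == "done":
            break
        # The agent just plays back whichever behavior the operator chose.
        print("agent>", responses.get(event, "Could you tell me more?"))

run_session("empathic")
```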

More work needs to be done before the agents can process those nonverbal cues on their own, he says.

The team has a working smartphone app that can assess a patient’s pain based on their speech. It can’t, however, pick up on nonverbal behaviors yet.

“We have an app that is in development that we are giving to cancer patients,” he says. “But it doesn’t currently do multimodal assessments, just language-based assessments. In the future, we’d like to be able to combine that information from the smartphone camera to get at some of this nonverbal behavior and integrate it.”  
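
A purely language-based assessment of the kind Bickmore describes could start from a speech transcript and extract structured fields. The sketch below is a hypothetical illustration, not the team’s app: it scans transcribed speech for a 0-to-10 rating and for descriptors common in pain questionnaires:

```python
# Hypothetical sketch of a language-based pain assessment: turn a
# speech transcript into a rough structured record. Not the team's app.
import re

# Descriptor vocabulary borrowed loosely from common pain questionnaires.
DESCRIPTORS = {"sharp", "dull", "burning", "throbbing", "stabbing", "aching"}

def assess(transcript: str) -> dict:
    words = set(re.findall(r"[a-z]+", transcript.lower()))
    # Look for phrases like "a 7 out of 10" or "7/10".
    match = re.search(r"\b(10|[0-9])\s*(?:out of|/)\s*10\b", transcript)
    return {
        "rating": int(match.group(1)) if match else None,
        "descriptors": sorted(words & DESCRIPTORS),
    }

print(assess("It's a sharp, burning pain, maybe a 7 out of 10."))
# -> {'rating': 7, 'descriptors': ['burning', 'sharp']}
```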

Cesareo Contreras is a Northeastern Global News reporter. Email him at c.contreras@northeastern.edu. Follow him on X/Twitter @cesareo_r and Threads @cesareor.