Could a smart device catch implicit bias in the workplace?

Northeastern researchers are embarking on a project that could yield an Alexa-like device able to alert users in professional settings to instances of implicit bias. Photo by Matthew Modoono/Northeastern University

Studies have shown that implicit bias—the automatic, and often unintentional, associations people have in their minds about groups of people—is ubiquitous in the workplace, and can hurt not just employees, but also a company’s bottom line. 

For example, employees who perceive bias are nearly three times as likely to be disengaged at work, and that disengagement doesn’t come cheap for employers, costing an estimated $450 billion to $550 billion a year. Despite the growing adoption of implicit bias training, some in the field of human resources have raised doubts about its effectiveness in improving diversity and inclusion within organizations.

But what if a smart device, similar to the Amazon Alexa, could tell when your boss inadvertently left a female colleague out of an important decision, or made her feel that her perspective wasn’t valued? 

Christoph Riedl is an associate professor with joint appointments in the Khoury College of Computer Sciences and the D’Amore McKim School of Business. Brooke Foucault Welles is an associate professor of communication studies in the College of Arts, Media and Design. Photos by Adam Glanzman and Ruby Wallau/Northeastern University

This device doesn’t yet exist, but Northeastern associate professors Christoph Riedl and Brooke Foucault Welles are preparing to embark on a three-year project that could yield such a gadget. The researchers will study, from a social science perspective, how teams communicate with one another, and with smart devices, while solving problems together.

“The vision that we have [for this project] is that you would have a device, maybe something like Amazon Alexa, that sits on the table and observes the human team members while they are working on a problem, and supports them in various ways,” says Riedl, who studies crowdsourcing, open innovation, and network science. “One of the ways in which we think we can support that team is by ensuring equal inclusion of all team members.”

The pair have received a $1.5 million, three-year grant from the U.S. Army Research Laboratory to study teams using a combination of social science theories, machine learning, and audio-visual and physiological sensors.

Welles says the project—which she and Riedl will undertake in collaboration with research colleagues from Columbia University, Rensselaer Polytechnic Institute, and the Army Research Lab—will allow her and her colleagues to program a sensor-equipped smart device to pick up on both verbal and nonverbal cues, and eventually physiological signals, shared between members of a team. The device would keep track of their interactions over time and, based on those interactions, make recommendations for improving the team’s productivity.

“You could imagine [a scenario] where maybe a manager at the end of a group deliberation gets a report that says person A was really dominating the conversation,” says Welles. The smart device would alert the manager to participants whose input might have been excluded, she says, with a reminder to follow up with them.
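As a rough illustration of how such a report might be assembled, the Python sketch below tallies per-person speaking time from diarized (speaker, duration) segments and flags who dominated and who was largely left out. The function name, data format, and thresholds are all assumptions made for illustration; the researchers have not described their actual system in these terms.

```python
from collections import defaultdict

def participation_report(segments, dominance_ratio=2.0, exclusion_ratio=0.5):
    """Summarize per-person speaking time from diarized (speaker, seconds) segments.

    `segments` is a list of (speaker, duration_seconds) tuples, such as the output
    of a speaker-diarization step. The thresholds are illustrative guesses: a speaker
    is flagged as dominating if they talk more than `dominance_ratio` times the
    per-person average, and as possibly excluded if they talk less than
    `exclusion_ratio` times that average.
    """
    totals = defaultdict(float)
    for speaker, seconds in segments:
        totals[speaker] += seconds

    average = sum(totals.values()) / len(totals)
    return {
        "speaking_time": dict(totals),
        "dominating": [s for s, t in totals.items() if t > dominance_ratio * average],
        "possibly_excluded": [s for s, t in totals.items() if t < exclusion_ratio * average],
    }

# Example: person A speaks far more than B and C in a short meeting.
meeting = [("A", 300), ("B", 90), ("C", 40), ("A", 250), ("B", 60)]
print(participation_report(meeting))
# {'speaking_time': {'A': 550.0, 'B': 150.0, 'C': 40.0},
#  'dominating': ['A'], 'possibly_excluded': ['C']}
```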

As a woman, Welles says she knows all too well how it feels to be excluded in a professional setting.

“When you’re having this experience, it’s really hard as the woman in the room to intervene and be like, ‘you’re not listening to me,’ or ‘I said that and he repeated it and now suddenly we believe it,’” she says. “I really love the idea of building a system that both empowers women with evidence that this is happening so that we can feel validated and also helps us point out opportunities for intervention.” 

Addressing these issues as soon as they occur could help cultivate a culture where all employees feel included, suggests Riedl.

“We know from the research that we and others have done that teams that exhibit a more equal distribution of speaking time tend to do better,” he says. “And so if we can build AI technology that helps the team function better, then that should increase team performance.”  
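One simple way to quantify an “equal distribution of speaking time” is the normalized entropy of each member’s share of the talk. The sketch below uses that measure purely as an assumption for illustration; the project has not specified which metric, if any, it will use.

```python
import math

def speaking_time_evenness(seconds_per_member):
    """Normalized Shannon entropy of speaking-time shares (assumes 2+ members).

    Returns 1.0 when every member speaks for the same amount of time and
    approaches 0.0 as one member takes over the conversation. This is one
    plausible measure of how evenly speaking time is distributed.
    """
    total = sum(seconds_per_member)
    shares = [s / total for s in seconds_per_member if s > 0]
    entropy = -sum(p * math.log(p) for p in shares)
    return entropy / math.log(len(seconds_per_member))

print(speaking_time_evenness([120, 110, 130]))  # ~1.00: nearly even
print(speaking_time_evenness([500, 30, 20]))    # ~0.33: one person dominates
```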

Their work could have useful applications in companies, nonprofits, universities, and other professional settings where people must collaborate to make decisions or solve problems, says Riedl.

The researchers expect to face some challenges. For one, says Welles, it’s anyone’s guess how knowing that such a device is in the room will affect how its occupants interact with one another. Another unknown, she says, is how subjects will respond to any errors the device makes.

Riedl points out that a compounding challenge is that team dynamics are still not fully understood, which will make it difficult to determine how a smart system can help, rather than hinder, a team of people. But within that challenge, he sees an opportunity to use artificial intelligence to help people in group settings work more efficiently.

“What I find super interesting about this project is that it is both using machine learning to study teams and then at the same time using AI to intervene on the team and make the team better,” he says.

The project will build upon earlier research Welles and Riedl conducted with Richard Radke from Rensselaer Polytechnic Institute in which they designed a sensor-embedded smart conference room where they tested the effect of lighting on small groups.

For media inquiries, please contact media@northeastern.edu