
Survey finds skepticism of sign language tech among Deaf and Hard-of-Hearing community

There are more than 100 registered sign languages throughout the world.

Saki Imai, a doctoral computer science student at Northeastern, surveyed the Deaf and Hard-of-Hearing community about sign language technology. Photo by Alyssa Stone/Northeastern University

Sign-language technology promises to “make your content available to millions” by using artificial intelligence to translate videos or even audio announcements into sign language. 

Many members of the Deaf and Hard-of-Hearing (DHH) community are skeptical, according to a new survey from Northeastern University that polled respondents from around the world.

“Access to these ‘technologies’—which are generally not useful and created as vanity projects by hearing dilettantes with no understanding of the deaf community—will make hearing people even less willing to accommodate language needs, thinking (wrongly) ‘there’s an app for that,’” one survey respondent wrote. 

That feedback is not off-base, the researchers behind the survey said. 

“Historically, developers and researchers have been building technology, such as sign language recognition, translation and generation tools, without any collaboration or getting insights from the Deaf community,” said Saki Imai, a PhD student at Northeastern who led the research. 

“We want [this community] to decide what is success – not just a number or an accuracy rate,” added Malihe Alikhani, an assistant professor at Northeastern who was also part of the study. 

Worldwide, more than 1.5 billion people live with hearing loss, and 430 million live with severe hearing loss, according to the World Health Organization.

Problems arise when sign language is treated as a monolith, however. There are more than 100 registered sign languages throughout the world, and much like English or any other spoken language, they have dialects and variants and continue to evolve, the researchers added.

For example, Black American Sign Language uses facial expressions to augment or change the meaning of certain signs. If someone makes the sign for “lawyer” while puffing out his or her cheeks, the sign instead means “crazy,” Alikhani said.

While artificial intelligence has allowed some software to recognize sign language, translate to and from it, and even generate it, that technology is only as good as the data used to train it.

“Just because something works based on preliminary data we collected a few years ago or the needs that we heard a few years ago, does it mean that in schools today, with this population, with this language variability, with these needs, it’s going to work?” Alikhani asked. 

Alikhani added that a lot of sign language technology was originally developed primarily to improve computers’ ability to recognize the boundaries of human fingers when they were doing different tasks.

“This was a challenge for the computer vision community within AI and computer science,” Alikhani said. “It was really excluded from the needs of the DHH community.”

Imai said researchers can also lose sight of the implications of their work during a product’s development.

“A lot of researchers, once they get started on a project, tend to care about the technicality and accuracy and how well it does on a given task and kind of forget about whether the purpose is aligned with the community that you want to serve and societal impact that your technology could have,” Imai said.

The results of the survey reflected what happens when programmers aren’t sensitive to sign language’s nuances, the researchers found. The standardization of signs used by the software led to international concerns that signs reflecting local nuance and culture might be ignored and thereby erased, according to Alikhani.

On the other hand, respondents with limited access to translation services were more optimistic about the technology, according to the research.

“They provide independence, autonomy, and critical support in situations where human interpreters are not available,” one respondent said.

Rachel Berman-Kobylarz, a principal lecturer in the American Sign Language and Interpreting Education program at Northeastern, did not participate in the survey effort but said that, as a Deaf person, she uses automated captioning technologies such as Otter AI transcription software for immediate communication in everyday situations.

Berman-Kobylarz said she was “cautiously optimistic” about sign-language technology, provided it is produced by or with Deaf people. She said she would only support companies that are Deaf-led and center Deaf experiences.

“We as a community have reasonable expectations about what sign language technology can and should do and how it’s used,” Berman-Kobylarz said. “It should be used as a tool, not a solution to replace real-life interpreters.”

She added that more important situations – for instance, in a medical or legal setting – require human interpreters, and replacing them would be problematic. This was a concern shared by survey respondents as well. 

Many respondents worried that the technology could be deployed as a cheaper substitute for human interpreters. Respondents from the United States, especially, worried about privacy issues, researchers found.

Lori Whynot, the director of the Northeastern ASL and Interpreting Education program, shared much of the skepticism of survey respondents. 

“There are lots of questions, there are lots of possibilities – which is always an exciting thing – but there’s a lot of skepticism,” Whynot, who also was not involved in the study and who is not deaf but is a hearing, professional ASL-English interpreter, said. “And as we see historically, the people that would most benefit from these technologies aren’t the ones asking the questions from the very start… They’re always the end user.”

So, what is a solution?

For one, we shouldn’t “fall into the trap” of thinking that AI can solve everything, Alikhani said.

“Technology cannot offer just one solution,” she said.

Importantly, Imai said the survey demonstrates the need for early and continuous collaboration between the DHH community and the hearing community – in research, testing, and all other stages of the product development process.

Berman-Kobylarz and Whynot agreed.

“If Deaf people are defining the problem and are being part of the solution, then I think it’s beneficial,” Whynot said. “But you’ve got to gain entry and trust to be able to work with a community that’s a small, tight-knit, culturally rich community with its own language and traditions and values. That has to be part of the equation.”