A woman sits in a chair using sign language in front of a bank of electronic equipment and computers.

Signing In

UA Research Aims to Improve Interactive Technologies for Deaf Community

Smart devices activated with voice commands are a growing segment of consumer technology, but they can be challenging to access for those who use American Sign Language as their primary language.

Saying “Ok, Google” or “Alexa” to start verbally interacting with virtual assistants that work with Google or Amazon systems, for instance, may not be user-friendly for those in deaf and hard-of-hearing communities.

Researchers across disciplines at The University of Alabama are developing an innovative way for deaf and hard-of-hearing communities to better interact with devices or smart environments. The patent-pending technology leverages radio frequency sensing to enable a human-computer interface built from the start to recognize the language of the deaf community — sign language.

The UA team developed radio detection and ranging, or radar, and artificial intelligence algorithms for a miniature RF sensor as a promising way to advance technologies built for sign language. Initial work demonstrated the concept, and the next phase is partnering with the deaf community to make the system more robust and user-driven.

“Deaf and hard-of-hearing communities are marginalized for their minority language status,” said Dr. Darrin Griffin, who studies interpersonal communication at UA. “They are generally not brought to the table for the access and inclusion they need. We are taking an approach to involve the deaf community when building these interfaces.”

Dr. Sevgi Zubeyde Gurbuz, who researches the design of next-generation AI-enhanced radars, leads the work to develop sign language recognition using radar and machine learning. The radar transceivers used are small, low-cost, low-power sensors designed for biomedical and automotive radar applications, and they transmit at lower power than cell phones.

“Our focus is not really on translation of sign language, the traditional focus of American Sign Language research,” she said. “Instead, we are focused on how we can pave the way for more technology sensitive to ASL, and thus, more accessible to the deaf community.”

Experience Gap

Three people talk in front of a bank of electronic equipment and computers.
From left, student Caroline Yuk, Dr. Darrin Griffin and Dr. Sevgi Zubeyde Gurbuz discuss their project to improve computer interaction for the deaf and hard-of-hearing communities.

As humans increasingly interact with technology beyond typing letters on a computer keyboard, an experience gap has opened between hearing and deaf and hard-of-hearing communities. Advancements in AI have opened gateways for new technologies ranging from devices such as wristwatches to robots and even smart environments.

Current ways for the deaf community to interact with technology without using a keyboard involve wearable sensors, such as gloves, and video.

Gloves that detect hand motion have uses in limited situations, but wearing a glove to activate a smart device or computer assistant is a non-starter for the deaf community, Griffin said.

Video has advantages, particularly when communicating with another person through a computer, such as on a video call, but it is limited by its field of view, its need for good lighting, and potential confusion from skin tone or masking by clothing. Also, for uses such as a home smart device, a video camera would need to remain on to watch for communication, much as these devices constantly listen for verbal commands now, raising concerns over potential surveillance.

Gurbuz said radar offers advantages over other sensors because it can work in any lighting, from any angle, and is not sensitive to skin tone or what the user is wearing. Moreover, radar is less invasive of personal space because it does not capture the face or environment.

“It can function in the background without affecting daily life at all,” she said.

Radar has drawbacks, though. It doesn’t capture facial expressions, a critical component of communication through sign language, nor can currently available commercial RF sensors capture the shapes a hand makes while signing. While RF hardware may advance enough to enable some hand-shape recognition in the future, the UA team has shown that, even with current capabilities, hand dynamics alone offer the potential for radar-based human-computer interaction.

“We trade off facial and shape recognition for privacy,” Gurbuz said. “We only need to recognize ASL to the extent that control and interfacing with technology can be accomplished.”

Unique Signatures

Radar uses radio waves to measure characteristics of movement such as distance and velocity. Hands moving in sign language can be visualized through profiles of velocity versus time, also known as micro-Doppler signatures. Gurbuz and her students have shown these signatures are unique to sign language and can be used to distinguish signing from other types of human activity, as well as to identify individual signs.
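The team’s actual signal-processing chain is not described here, but a micro-Doppler signature of this kind is typically obtained by taking a short-time Fourier transform of a radar’s complex returns and converting Doppler frequency to velocity. The Python sketch below is a minimal, hypothetical illustration: the sample rate, wavelength, and synthetic hand motion are placeholder assumptions, not values from the UA sensor.

```python
# Illustrative sketch: computing a micro-Doppler signature (velocity vs. time)
# from complex radar returns. Not the UA team's actual pipeline; the parameters
# (sample rate, wavelength) are hypothetical placeholders.
import numpy as np
from scipy.signal import stft

def micro_doppler_signature(iq_samples, sample_rate_hz, wavelength_m):
    """Return time bins, velocity bins, and a dB-scaled micro-Doppler map."""
    # Short-time Fourier transform over slow time: frequency axis = Doppler shift.
    freqs, times, spec = stft(iq_samples, fs=sample_rate_hz,
                              nperseg=256, noverlap=192,
                              return_onesided=False)
    # Re-center zero Doppler and convert Doppler frequency to radial velocity:
    # v = f_doppler * wavelength / 2.
    freqs = np.fft.fftshift(freqs)
    spec = np.fft.fftshift(spec, axes=0)
    velocities = freqs * wavelength_m / 2.0
    power_db = 20.0 * np.log10(np.abs(spec) + 1e-12)
    return times, velocities, power_db

# Example with synthetic data: a single scatterer (a hand) oscillating
# toward and away from the sensor.
fs = 2000.0                      # slow-time sample rate in Hz (assumed)
wavelength = 0.0039              # roughly a 77 GHz automotive band (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
motion_velocity = 1.5 * np.sin(2 * np.pi * 2.0 * t)             # m/s
phase = 4 * np.pi / wavelength * np.cumsum(motion_velocity) / fs
iq = np.exp(1j * phase)          # idealized complex radar return
times, velocities, power_db = micro_doppler_signature(iq, fs, wavelength)
```

Plotting `power_db` against `times` and `velocities` yields the velocity-versus-time profile described above, with the oscillating hand tracing a sinusoidal ribbon around zero Doppler.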

“We’ve shown there’s definitely information to be gotten from the sensor and a wide range of signs that can be recognized,” she said.

The team filed a patent to cover the use of radar and data-analysis algorithms that interpret sign language. While radar has been used for hand-gesture recognition before, UA’s effort is pioneering the development of radar-based sign language recognition.

So far, the computer recognizes about 100 common words in ASL. As with speech recognition, the computer needs more examples to learn the signs and the differences in their signatures.

“We’re teaching a language to a baby, essentially,” Griffin said. “We are starting with the fundamentals because if we want this to be sign language friendly, we have to train the computer.”
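To make the idea of “teaching the computer” concrete, the sketch below shows, in broad strokes, how a classifier can be trained on labeled micro-Doppler maps. It is purely illustrative: the sign labels, the synthetic data, and the support-vector classifier are placeholder assumptions, not the UA team’s models or dataset.

```python
# Illustrative sketch only: training a classifier to recognize individual signs
# from micro-Doppler maps. Labels and data are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Pretend dataset: 300 micro-Doppler maps (64 velocity bins x 64 time bins)
# for three hypothetical trigger signs.
signs = ["HELLO", "LIGHT-ON", "LIGHT-OFF"]
X = rng.normal(size=(300, 64 * 64))          # flattened signature maps
y = rng.integers(0, len(signs), size=300)    # stand-in labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# A simple pipeline: normalize each feature, then fit a support-vector classifier.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
print("predicted sign for first test sample:", signs[model.predict(X_test[:1])[0]])
```

The accuracy of any such model depends on how many fluent signers contribute examples, which is why the partnerships described below matter as much as the algorithms.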

An RV-like bus parked on the UA campus.
The Alabama Institute for Deaf and Blind STEM Bus came to campus in April as part of a developing partnership with UA researchers.

Partnerships

With proof of concept, UA researchers are partnering with the Alabama Institute for Deaf and Blind, a world-class education, rehabilitation and service program serving individuals of all ages who are deaf, blind, deaf-blind and multi-disabled.

“We want to integrate the deaf community at all levels of this project,” Gurbuz said.

Besides more data, the computer needs higher-quality data to get better at sign recognition. Gurbuz, who is learning ASL, could train the computer, but her signing is not the same as that of someone fluent in the language. Plus, as with spoken language, signers add small nuances that become dialects, which are difficult for a few people to recreate in a lab setting for the computer.

Also, a smart device or computer assistant needs trigger words to activate, and a technology designed for the deaf community would be no different. Students at AIDB can help the researchers develop appropriate trigger signs to activate devices.

The UA team is developing RF-sensor-based interactive devices that AIDB students can use not only to provide data, but also to learn more about computer interfaces and programming through an interactive, educational activity built for use on the AIDB STEM Bus.

“We are not trying to develop technology for them,” Griffin said. “We want to work together with the deaf community to develop technology they want to use.”

This work is supported by a grant from the National Science Foundation.

Griffin is an associate professor of communication studies. Gurbuz is an assistant professor of electrical and computer engineering. Other team members include Dr. Chris Crawford, assistant professor of computer science; Dr. Evie Malaia, associate professor of communicative disorders; and Dr. Ali Cafer Gurbuz, assistant professor of electrical and computer engineering at Mississippi State University.

Contact

Adam Jones, UA communications, 205-348-4328, adam.jones@ua.edu