American Sign Language (ASL) is the predominant language of deaf communities in the United States and Canada, used by approximately 500,000 people who communicate through hand motions and facial expressions. Most of the hearing population, however, never learns ASL, which creates comprehension and translation barriers when hearing and deaf individuals interact. This gap can lead to misunderstanding and frustration for all parties, especially within families.
Researchers at The University of Alabama have developed a system that uses radar to recognize hand gestures and a 3D camera to interpret other forms of nonverbal communication, such as facial expressions. The collected data is then translated into text, symbols, and audio, improving communication between deaf and hearing people. The radar portion of the system operates without additional equipment such as special lighting or signing gloves, which preserves privacy and reduces logistical hassle. This novel invention has far-reaching applications in education and healthcare, strengthening interactions between deaf and hearing individuals.