Real-Time American Sign Language Translation Through Wearable AI Technology
Current assistive technologies for sign language translation are limited in availability, accessibility, or real-time performance. Research in wearable interfaces, particularly those involving sensor-equipped gloves, has shown promise in recognizing hand gestures and converting them to text.
Advances in embedded AI and edge computing have made it feasible to run lightweight models directly on wearable devices, reducing latency and the need for constant connectivity. While some prototypes exist, most systems rely on external processing or large datasets, making them less portable or scalable.
This project builds on these foundations by integrating motion sensors, flex sensors, and an onboard AI model into a self-contained glove, aiming to offer a more practical and real-time solution for ASL translation.
We are designing KYRA, a smartwatch-like device embedded within a wearable glove that interprets American Sign Language through built-in sensors and onboard processing. By integrating compact components such as flex sensors, motion detectors, and a lightweight AI model, the system will read hand gestures and convert them into audible English.
Our vision is to create an accessible, portable, and real-time communication tool that bridges the gap between sign language users and the wider community.
Unlike existing solutions that require external processing, KYRA features onboard AI processing for true real-time translation without the need for constant internet connectivity.
The ESP32 development board acts as the central processing unit of the glove. It runs the AI model, processes sensor data, and communicates with the computer. Its built-in Wi-Fi and Bluetooth enable fast, wireless data transfer from the glove to external devices.
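To make the data path concrete, here is a minimal MicroPython sketch of how the ESP32 might stream sensor packets to a computer over Wi-Fi. This is an illustration only: the network credentials, host address, and `read_sensors()` helper are placeholders, and the actual firmware could just as well use Bluetooth or a different protocol.

```python
# Minimal MicroPython sketch (illustrative): connect the ESP32 to Wi-Fi
# and stream sensor readings to a host computer over UDP.
import network
import socket
import time

HOST_ADDR = ("192.168.1.50", 9000)  # placeholder: IP/port of the listening computer

def connect_wifi(ssid, password):
    wlan = network.WLAN(network.STA_IF)
    wlan.active(True)
    wlan.connect(ssid, password)
    while not wlan.isconnected():
        time.sleep(0.1)
    return wlan

def read_sensors():
    # Stand-in for the flex-sensor and MPU6050 reads sketched later on
    # this page; returns one number per sensor channel.
    return [0.0] * 11

connect_wifi("my-network", "my-password")  # placeholder credentials
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    readings = read_sensors()
    sock.sendto(",".join(str(v) for v in readings).encode(), HOST_ADDR)
    time.sleep_ms(20)  # roughly 50 Hz sampling
```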
The 3.7V LiPo battery supplies portable power to the glove, enabling wireless operation. It ensures consistent energy delivery to the sensors, microcontroller, and onboard AI model for real-time gesture translation.
The MPU6050 detects hand orientation and movement in real time. It provides accelerometer and gyroscope data that help the glove accurately interpret dynamic ASL gestures. This motion data is crucial for understanding signs that involve wrist and hand motion.
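Below is a minimal MicroPython sketch of how the glove might read the MPU6050 over I2C. The pin assignments are assumptions about the wiring; the register addresses and scale factors follow the sensor's datasheet defaults.

```python
# Illustrative MicroPython sketch: read raw accelerometer and gyroscope
# values from the MPU6050 over I2C. Pin numbers are assumed wiring.
from machine import I2C, Pin
import struct

MPU_ADDR = 0x68                            # default MPU6050 I2C address
i2c = I2C(0, scl=Pin(22), sda=Pin(21))     # common ESP32 I2C pins (assumed)
i2c.writeto_mem(MPU_ADDR, 0x6B, b"\x00")   # clear PWR_MGMT_1 to wake the sensor

def read_motion():
    # Registers 0x3B-0x48 hold accel XYZ, temperature, and gyro XYZ
    # as big-endian signed 16-bit values (14 bytes total).
    raw = i2c.readfrom_mem(MPU_ADDR, 0x3B, 14)
    ax, ay, az, _, gx, gy, gz = struct.unpack(">7h", raw)
    # Scale for the default ranges: +/-2 g and +/-250 deg/s.
    accel = (ax / 16384.0, ay / 16384.0, az / 16384.0)
    gyro = (gx / 131.0, gy / 131.0, gz / 131.0)
    return accel, gyro
```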
The five flex sensors measure the bending of each finger to detect the specific hand shapes used in ASL. As a finger bends, the sensor's resistance changes, providing data to the AI model. These readings help identify static signs based on finger positioning.
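The sketch below illustrates one way to sample the five flex sensors with the ESP32's ADC, again in MicroPython. Each sensor is assumed to sit in a voltage divider, and the GPIO pin numbers are placeholder wiring, not the project's confirmed layout.

```python
# Illustrative MicroPython sketch: sample five flex sensors, one per finger.
# Each sensor forms a voltage divider, so finger bend shows up as a
# changing ADC reading.
from machine import ADC, Pin

FLEX_PINS = (32, 33, 34, 35, 36)  # ADC-capable ESP32 pins (assumed wiring)

flex_adcs = []
for pin in FLEX_PINS:
    adc = ADC(Pin(pin))
    adc.atten(ADC.ATTN_11DB)  # full 0-3.3 V input range on the ESP32
    flex_adcs.append(adc)

def read_fingers():
    # Returns one 12-bit reading (0-4095) per finger; bending a finger
    # changes the sensor's resistance and thus the divider voltage.
    return [adc.read() for adc in flex_adcs]
```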
The charging circuit manages safe recharging of the 3.7V LiPo battery inside the glove. It regulates voltage and current to protect the battery from overcharging, ensuring reliable, long-term power for continuous device operation.
The wearable housing and wrist straps encase the glove's electronic components and secure them comfortably around the user's wrist. The smartwatch-like design keeps components stable and ensures consistent sensor placement for accurate gesture detection.
Exploded view showing the complete KYRA system: ESP32 development board, MPU6050 accelerometer, 3.7V LiPo battery, charging circuit, wearable housing, wrist straps, and five flex sensors for finger tracking
Design and assemble the glove using flex sensors, motion sensors, and a microcontroller. Ensure accurate signal capture from hand and finger movements.
Train and deploy a lightweight machine learning model capable of recognizing ASL gestures based on sensor input. Optimize it for real-time processing on the onboard system.
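As an illustration of what "lightweight" can mean here, the following sketch trains a small random-forest classifier on labeled sensor snapshots using scikit-learn. The file name, feature layout, and model choice are all assumptions for the example, not the project's actual pipeline.

```python
# One plausible training setup (assumed, not the project's confirmed pipeline):
# a CSV of labeled snapshots with 5 flex readings + 6 motion values per row
# and the signed gesture's label in the last column.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

data = np.genfromtxt("gestures.csv", delimiter=",", dtype=str)  # hypothetical file
X = data[:, :-1].astype(float)   # 11 sensor features per sample
y = data[:, -1]                  # gesture label, e.g. "HELLO"

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# A shallow forest keeps memory and compute low enough to consider
# porting inference onto the glove's microcontroller.
model = RandomForestClassifier(n_estimators=30, max_depth=8, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```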
Translate recognized gestures into text and convert them into speech using a connected computer. Begin testing for accuracy, latency, and usability in real-world conditions.
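A minimal host-side sketch of this step might look like the following: the computer reads recognized gesture labels from the glove over USB serial and speaks them with a text-to-speech engine. The serial port name is a placeholder, and pyserial and pyttsx3 are assumed libraries rather than the project's confirmed stack.

```python
# Illustrative host-side sketch (assumed stack): read gesture labels the
# glove prints over USB serial, then speak them aloud.
import serial     # pyserial
import pyttsx3    # offline text-to-speech

port = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)  # placeholder port name
engine = pyttsx3.init()

while True:
    line = port.readline().decode("utf-8", errors="ignore").strip()
    if line:  # e.g. "HELLO", as recognized by the onboard model
        print("recognized:", line)
        engine.say(line)
        engine.runAndWait()
```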
Current prototype assembly
The KYRA project represents a significant step forward in assistive technology for the deaf and hard-of-hearing community. Future development will focus on improving the system's accuracy, latency, and real-world usability.
Want to learn more about assistive technology development or discuss potential collaborations?