
KYRA ASL Translation Glove

Real-Time American Sign Language Translation Through Wearable AI Technology

May 2025 - Ongoing
AI-Powered Wearable
KYRA ASL Translation Glove - Internal Hardware Components
Project Introduction

Current assistive technologies for sign language translation are limited in availability, accessibility, or real-time performance. Research in wearable interfaces, particularly those involving sensor-equipped gloves, has shown promise in recognizing hand gestures and converting them to text.

Advances in embedded AI and edge computing have made it feasible to run lightweight models directly on wearable devices, reducing latency and the need for constant connectivity. While some prototypes exist, most systems rely on external processing or large datasets, making them less portable or scalable.

This project builds on these foundations by integrating motion sensors, flex sensors, and an onboard AI model into a self-contained glove, aiming to offer a more practical and real-time solution for ASL translation.

Project Vision

We've set out to design a smartwatch-like device, embedded within a wearable glove, that can interpret American Sign Language through built-in sensors and onboard processing. By integrating compact components such as flex sensors, motion detectors, and a lightweight AI model, the system will read hand gestures and convert them into audible English.

Our vision is to create an accessible, portable, and real-time communication tool that bridges the gap between sign language users and the wider community.

Key Innovation

Unlike existing solutions that require external processing, KYRA features onboard AI processing for true real-time translation without the need for constant internet connectivity.

System Components

ESP32 WROOM-32

Acts as the central processing unit of the glove. It runs the AI model, processes sensor data, and communicates with the computer. Its built-in Wi-Fi and Bluetooth enable fast, wireless data transfer from the glove to external devices.
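
As a rough illustration of this wireless link, here is a minimal Arduino-framework sketch that streams text from the ESP32 to a paired computer over Classic Bluetooth SPP. The device name and payload are placeholders, not KYRA's actual firmware:

```cpp
// Minimal ESP32 Bluetooth link (Arduino framework, Classic Bluetooth SPP).
// Streams a text payload to a paired computer once per second.
#include "BluetoothSerial.h"

BluetoothSerial SerialBT;

void setup() {
  Serial.begin(115200);          // USB serial for debugging
  SerialBT.begin("KYRA-Glove");  // advertised device name (placeholder)
}

void loop() {
  // In real firmware this payload would carry sensor readings
  // or a recognized gesture label.
  SerialBT.println("sensor-data-placeholder");
  delay(1000);
}
```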

3.7V LiPo Battery Pack

Supplies portable power to the glove, enabling fully wireless operation. It ensures consistent energy delivery to the sensors, microcontroller, and onboard AI model during real-time gesture translation.

MPU6050 Accelerometer & Gyroscope

Detects hand orientation and movement in real time. It provides accelerometer and gyroscope data to help the glove accurately interpret dynamic ASL gestures. This motion data is crucial for understanding signs involving wrist and hand motion.
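
To sketch how this motion data is obtained, the snippet below reads raw accelerometer and gyroscope values from the MPU6050 over I2C with the Arduino Wire library. The register addresses come from the MPU6050 datasheet; the polling rate and pin defaults are assumptions, and a library such as Adafruit_MPU6050 would work equally well:

```cpp
// Raw MPU6050 read over I2C (Arduino framework). Register addresses
// come from the MPU6050 datasheet; pins and rates are illustrative.
#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;   // I2C address with AD0 tied low

int16_t read16() {               // combine two registers into one value
  int16_t hi = Wire.read();
  int16_t lo = Wire.read();
  return (hi << 8) | lo;
}

void setup() {
  Serial.begin(115200);
  Wire.begin();                  // ESP32 defaults: SDA 21, SCL 22
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);              // PWR_MGMT_1 register
  Wire.write(0);                 // clear sleep bit to wake the sensor
  Wire.endTransmission(true);
}

void loop() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);              // ACCEL_XOUT_H, start of the data block
  Wire.endTransmission(false);   // repeated start, keep the bus
  Wire.requestFrom(MPU_ADDR, (uint8_t)14);

  int16_t ax = read16(), ay = read16(), az = read16();
  read16();                      // skip the temperature registers
  int16_t gx = read16(), gy = read16(), gz = read16();

  Serial.printf("a=(%d,%d,%d) g=(%d,%d,%d)\n", ax, ay, az, gx, gy, gz);
  delay(20);                     // ~50 Hz polling (illustrative)
}
```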

Flex Sensors

Measure the bending of each finger to detect specific hand shapes used in ASL. As a finger bends, its sensor's resistance changes, providing data to the AI model. These readings help identify static signs based on finger positioning.
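
As a sketch of how these readings might be captured, the snippet below samples five flex sensors wired as voltage dividers into the ESP32's ADC and normalizes each to a 0–1 bend value. The pin choices and calibration constants are placeholders:

```cpp
// Sampling five flex-sensor voltage dividers on the ESP32 ADC
// (Arduino framework). Pins and calibration limits are placeholders.
const int FLEX_PINS[5] = {32, 33, 34, 35, 36};  // ADC1-capable pins
const int RAW_STRAIGHT = 1500;  // raw reading, finger straight (placeholder)
const int RAW_BENT     = 3200;  // raw reading, finger fully bent (placeholder)

void setup() {
  Serial.begin(115200);
  analogReadResolution(12);     // readings span 0..4095
}

void loop() {
  float bend[5];
  for (int i = 0; i < 5; i++) {
    int raw = analogRead(FLEX_PINS[i]);
    // Map the raw range onto 0..1; per-finger limits would come from
    // a calibration pass (see the sketch under Development Methodology).
    bend[i] = constrain((raw - RAW_STRAIGHT) /
                        float(RAW_BENT - RAW_STRAIGHT), 0.0f, 1.0f);
  }
  Serial.printf("%.2f %.2f %.2f %.2f %.2f\n",
                bend[0], bend[1], bend[2], bend[3], bend[4]);
  delay(20);
}
```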

Charging Circuit

Manages safe recharging of the 3.7V LiPo battery inside the glove. It regulates voltage and current to protect the battery from overcharging. This ensures reliable, long-term power for continuous device operation.

Housing & Wrist Straps

Encases the glove's electronic components and secures them comfortably around the user's wrist. The smartwatch-like design keeps components stable and ensures consistent sensor placement for accurate gesture detection.

System Architecture

KYRA System Exploded View - Complete Component Breakdown

Exploded view showing the complete KYRA system: ESP32 development board, MPU6050 accelerometer, 3.7V LiPo battery, charging circuit, wearable housing, wrist straps, and five flex sensors for finger tracking

Development Methodology
01

Hardware Design & Assembly

Design and assemble the glove using flex sensors, motion sensors, and a microcontroller. Ensure accurate signal capture from hand and finger movements.
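
One way to verify accurate signal capture is a short calibration pass that records each sensor's raw range; the limits it prints would replace the placeholder constants in the flex-sensor sketch above. Everything here (pins, timing, prompts) is illustrative:

```cpp
// Illustrative flex-sensor calibration: track each sensor's raw
// min/max while the user moves between a flat hand and a fist.
const int FLEX_PINS[5] = {32, 33, 34, 35, 36};  // placeholder pins

void captureRange(int lo[5], int hi[5], unsigned long ms) {
  unsigned long t0 = millis();
  while (millis() - t0 < ms) {
    for (int i = 0; i < 5; i++) {
      int raw = analogRead(FLEX_PINS[i]);
      if (raw < lo[i]) lo[i] = raw;
      if (raw > hi[i]) hi[i] = raw;
    }
    delay(5);
  }
}

void setup() {
  Serial.begin(115200);
  analogReadResolution(12);
  int lo[5] = {4095, 4095, 4095, 4095, 4095};
  int hi[5] = {0, 0, 0, 0, 0};
  Serial.println("Flex each finger fully over the next 5 seconds...");
  captureRange(lo, hi, 5000);
  for (int i = 0; i < 5; i++)
    Serial.printf("finger %d: straight ~%d, bent ~%d\n", i, lo[i], hi[i]);
}

void loop() {}
```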

02

AI Model Development

Train and deploy a lightweight machine learning model capable of recognizing ASL gestures based on sensor input. Optimize it for real-time processing on the onboard system.
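
The model itself is still in the data-collection phase, so the following is only a sketch of what onboard inference could look like: a tiny fully connected network mapping an 11-value feature vector (5 normalized flex readings plus 6 IMU axes) to a gesture class. The layer sizes and weights are placeholders; a framework such as TensorFlow Lite for Microcontrollers is another common route on this class of hardware:

```cpp
// Sketch of onboard inference: a small dense network, 11 inputs ->
// 16 hidden (ReLU) -> N_OUT gesture scores. Weights would be exported
// from offline training; here they are left as placeholders.
#include <math.h>

const int N_IN = 11, N_HID = 16, N_OUT = 5;
float W1[N_HID][N_IN];   // hidden-layer weights (trained offline)
float b1[N_HID];
float W2[N_OUT][N_HID];  // output-layer weights
float b2[N_OUT];

int classify(const float x[N_IN]) {
  float h[N_HID];
  for (int j = 0; j < N_HID; j++) {          // hidden layer
    float s = b1[j];
    for (int i = 0; i < N_IN; i++) s += W1[j][i] * x[i];
    h[j] = s > 0 ? s : 0;                    // ReLU
  }
  int best = 0;
  float bestScore = -INFINITY;
  for (int k = 0; k < N_OUT; k++) {          // output layer + argmax
    float s = b2[k];
    for (int j = 0; j < N_HID; j++) s += W2[k][j] * h[j];
    if (s > bestScore) { bestScore = s; best = k; }
  }
  return best;  // index of the most likely gesture
}
```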

03

Translation & Testing

Translate recognized gestures into text and convert them into speech using a connected computer. Begin testing for accuracy, latency, and usability in real-world conditions.
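
As a sketch of the glove-side half of this step, the snippet below maps a predicted class index to a word and sends it as one text line over Bluetooth; a host-side script would read each line and pass it to a text-to-speech engine. The vocabulary and protocol are placeholders:

```cpp
// Illustrative translation output: one recognized word per Bluetooth line.
#include "BluetoothSerial.h"

BluetoothSerial SerialBT;

// Placeholder vocabulary; the real label set comes from model training.
const char* LABELS[] = {"hello", "yes", "no", "thanks", "please"};

void announce(int classIndex) {
  SerialBT.println(LABELS[classIndex]);  // host script reads and speaks it
}

void setup() {
  SerialBT.begin("KYRA-Glove");  // placeholder device name
}

void loop() {
  // In real firmware classIndex would come from the classifier above.
  announce(0);
  delay(2000);
}
```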

Current Progress & Prototype

Development Status

Hardware Assembly: Prototype completed
Sensor Calibration: In progress
AI Model Training: Data collection phase
Real-time Translation: Planned for next phase
KYRA Prototype

Current prototype assembly

Technical Achievements

  • Successfully integrated ESP32 with multiple sensor inputs
  • Achieved stable wireless communication between glove and computer
  • Implemented real-time sensor data collection and processing
  • Designed compact, wearable form factor with smartwatch-like aesthetics
  • Established reliable power management system for extended operation
Future Development

The KYRA project represents a significant step forward in assistive technology for the deaf and hard-of-hearing community. Future development will focus on:

  • Enhanced AI Model: Expanding gesture recognition to include more complex ASL vocabulary and grammar structures
  • Bilateral Support: Developing a two-glove system for complete ASL translation including two-handed signs
  • Mobile Integration: Creating a companion mobile app for personalized settings and translation history
  • Commercial Viability: Optimizing design for mass production and exploring partnerships with assistive technology companies
Project Details
Start Date: May 2025
Type: Personal Project
Status: In Development
Category: Assistive Technology
Technologies Used
ESP32 WROOM-32
MPU6050 Accelerometer & Gyroscope
Flex Sensors
Machine Learning
Embedded AI
Wearable Electronics
Key Specifications
Battery Voltage: 3.7V
Flex Sensors: 5
Processing: Real-time
Interested in KYRA?

Want to learn more about assistive technology development or discuss potential collaborations?