A Smart Glove-Based System for Dynamic Sign Language Translation Using LSTM Networks
Published: 07 November 2025
by MDPI
in The 12th International Electronic Conference on Sensors and Applications
session Sensors and Artificial Intelligence
https://doi.org/10.3390/ECSA-12-26530
Abstract:
This research presents a novel, real-time Pakistani Sign Language (PSL) recognition system utilizing a custom-designed sensory glove integrated with advanced machine learning techniques. The system aims to bridge communication gaps for individuals with hearing and speech impairments by translating hand gestures into readable text. At the core of this work is a smart glove engineered with five resistive flex sensors for precise finger flexion detection and a 9-DOF Inertial Measurement Unit (IMU) for capturing hand orientation and movement. The glove is powered by a compact microcontroller, which processes the analog and digital sensor inputs and transmits the data wirelessly to a host computer. A rechargeable 3.7 V Li-Po battery ensures portability, while a dataset comprising both static alphabet gestures and dynamic PSL phrases was recorded using this setup. The collected data was used to train two models: a Support Vector Machine with feature extraction (SVM-FE) and a Long Short-Term Memory (LSTM) deep learning network. The LSTM model outperformed traditional methods, achieving an accuracy of 98.6% in real-time gesture recognition. The proposed system demonstrates robust performance and offers practical applications in smart home interfaces, virtual and augmented reality, gaming, and assistive technologies. By combining ergonomic hardware with intelligent algorithms, this research takes a significant step toward inclusive communication and more natural human-machine interaction.
Keywords: Sensory Glove; Hand Gesture Recognition; Machine Learning; Deep Learning; Long Short-Term Memory (LSTM)
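The abstract describes streaming glove readings (five flex sensors plus a 9-DOF IMU) into an LSTM for sequence classification. The paper does not publish its preprocessing or model code, so the sketch below is purely illustrative: the 14-channel frame layout, window length, hidden size, and class count are all assumptions, and the forward-only numpy LSTM stands in for whatever framework the authors actually used. It shows only the general pattern of windowing a sensor stream and classifying each window from the final hidden state.

```python
import numpy as np

# Hypothetical sketch -- the feature layout (5 flex + 9 IMU channels = 14
# values per frame), window length, and model sizes are assumptions, not
# values taken from the paper.
N_FEATURES = 14      # 5 flex sensors + 9-DOF IMU (3 accel, 3 gyro, 3 mag)
WINDOW = 50          # frames per gesture window (assumed)
HIDDEN = 32          # LSTM hidden units (assumed)
N_CLASSES = 40       # static alphabet + dynamic phrases (assumed count)

rng = np.random.default_rng(0)

def make_windows(stream, window=WINDOW, stride=25):
    """Slice a (T, N_FEATURES) sensor stream into overlapping windows."""
    return np.stack([stream[i:i + window]
                     for i in range(0, len(stream) - window + 1, stride)])

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyLSTM:
    """Minimal single-layer LSTM classifier (forward pass only)."""
    def __init__(self, n_in, n_hidden, n_out):
        self.W = rng.normal(0, 0.1, (4 * n_hidden, n_in + n_hidden))
        self.b = np.zeros(4 * n_hidden)
        self.Wo = rng.normal(0, 0.1, (n_out, n_hidden))
        self.n_hidden = n_hidden

    def forward(self, seq):
        h = np.zeros(self.n_hidden)
        c = np.zeros(self.n_hidden)
        H = self.n_hidden
        for x in seq:  # one sensor frame per timestep
            z = self.W @ np.concatenate([x, h]) + self.b
            i, f, g, o = z[:H], z[H:2*H], z[2*H:3*H], z[3*H:]
            c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
            h = sigmoid(o) * np.tanh(c)
        logits = self.Wo @ h  # classify from the final hidden state
        e = np.exp(logits - logits.max())
        return e / e.sum()    # softmax class probabilities

stream = rng.normal(size=(200, N_FEATURES))  # stand-in for live glove data
windows = make_windows(stream)               # (7, 50, 14) for this stream
model = TinyLSTM(N_FEATURES, HIDDEN, N_CLASSES)
probs = model.forward(windows[0])
print(windows.shape, probs.shape)
```

In a real deployment the weights would come from training on the recorded PSL dataset rather than random initialization, and the windowing stride would be tuned against the glove's sampling rate.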
