Design and Implementation of Multilingual Sign Language Recognition System

Student: Ajibola Ebenezer Sanya (Project, 2025)
Department of Computer and Information Science
Bamidele Olumilua University of Education, Science and Technology, Ikere-Ekiti, Ekiti State


Abstract

The Multilingual Sign Language Recognition System addresses
communication barriers faced by the hearing- and speech-impaired community,
especially in multilingual contexts. Leveraging advanced deep learning
techniques, namely the YOLO algorithm for real-time gesture detection and
TensorFlow for classification, the system recognizes hand gestures across
multiple sign languages, such as American Sign Language (ASL) and British Sign
Language (BSL). The study achieved a detection accuracy of 99%. Despite
limitations such as dependence on high-performance hardware and the exclusion
of facial expressions, the project demonstrates significant potential as an
assistive technology. Recommendations include expanding dataset diversity,
integrating additional recognition capabilities, and optimizing the system for
edge devices to enhance accessibility and scalability.

Keywords
design, implementation, multilingual, sign language recognition, system
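
The abstract describes a two-stage pipeline: YOLO localizes the signing hand in each
video frame, and a TensorFlow model classifies the cropped region as a gesture. The
following is a minimal sketch of such a pipeline under stated assumptions, not the
project's actual implementation; the weight file names (hand_detector.pt,
sign_classifier.h5), the GESTURE_LABELS list, and the 224x224 input size are
illustrative placeholders.

# Minimal sketch of the two-stage pipeline described in the abstract:
# YOLO detects hands in each video frame, and a TensorFlow/Keras model
# classifies the cropped hand region as a sign-language gesture.
# "hand_detector.pt", "sign_classifier.h5", and GESTURE_LABELS are
# placeholders for project-specific artifacts.
import cv2
import numpy as np
import tensorflow as tf
from ultralytics import YOLO

detector = YOLO("hand_detector.pt")                            # placeholder YOLO hand-detection weights
classifier = tf.keras.models.load_model("sign_classifier.h5")  # placeholder gesture classifier
GESTURE_LABELS = ["A", "B", "C"]                               # placeholder class names (ASL/BSL gestures)

cap = cv2.VideoCapture(0)                                      # webcam stream
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # Stage 1: detect hand bounding boxes with YOLO.
    boxes = detector(frame, verbose=False)[0].boxes.xyxy.cpu().numpy()

    for x1, y1, x2, y2 in boxes.astype(int):
        crop = frame[y1:y2, x1:x2]
        if crop.size == 0:
            continue

        # Stage 2: classify the cropped hand region with the TensorFlow model.
        crop = cv2.resize(crop, (224, 224)).astype(np.float32) / 255.0
        probs = classifier.predict(crop[None, ...], verbose=0)[0]
        label = GESTURE_LABELS[int(np.argmax(probs))]

        # Draw the detection and its predicted gesture label on the frame.
        cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
        cv2.putText(frame, label, (x1, y1 - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)

    cv2.imshow("Sign recognition", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()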