Sign language is an essential medium of communication for individuals with hearing impairments. However, limited public understanding continues to create barriers between deaf individuals and the broader community. This study aims to develop an AI-based mobile application capable of recognizing and translating Indonesian Sign Language (BISINDO) gestures in real time. The system integrates a YOLOv8 model for gesture detection in a mobile environment: the lightweight YOLOv8n variant was trained on a BISINDO dataset and evaluated using precision, recall, and mean average precision (mAP). The model obtained an mAP@50 of 98% under the given experimental settings. These results indicate that the YOLOv8 architecture can be applied to mobile-based real-time gesture recognition and may serve as a foundation for future research on assistive communication technologies.
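As context for the metrics quoted above, the following minimal sketch illustrates how precision, recall, and average precision are conventionally computed from detection outcomes. The counts and curve points are hypothetical examples, not data from this study:

```python
def precision(tp: int, fp: int) -> float:
    """Fraction of predicted detections that are correct."""
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    """Fraction of ground-truth gestures that were detected."""
    return tp / (tp + fn)

def average_precision(points):
    """Area under a precision-recall curve given (recall, precision)
    points sorted by increasing recall, using rectangular integration.
    mAP@50 averages this quantity over classes, counting a detection
    as correct when its IoU with the ground truth is at least 0.5."""
    ap, prev_r = 0.0, 0.0
    for r, p in points:
        ap += (r - prev_r) * p
        prev_r = r
    return ap

# Hypothetical counts for one gesture class at IoU >= 0.5.
print(precision(98, 2))                              # 0.98
print(recall(98, 4))                                 # ~0.9608
print(average_precision([(0.5, 1.0), (1.0, 0.5)]))   # 0.75
```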
Copyright © 2025