Sign language is used by individuals with disabilities, particularly the deaf and those with speech impairments, as their primary means of communication. However, interaction between people with disabilities and the general public is often hampered by the public's limited understanding of sign language. This study aims to develop an artificial intelligence-based application capable of detecting and classifying hand movements in Indonesian Sign Language (BISINDO) using the YOLOv8 algorithm. YOLOv8 was chosen for its ability to detect and classify objects in real time with high accuracy, even under varying lighting and background conditions. This is one of the first studies to implement YOLOv8 for real-time BISINDO detection integrated with a web interface. The dataset comprises 51 classes of hand movements with a total of 10,822 images, augmented to increase data diversity. The development process involved data collection, pre-processing, annotation, model training, and integration with an interactive web interface. The resulting model demonstrated high performance, achieving mAP@50 of 96%, mAP@50-95 of 70%, and classification accuracy of 93.8% in the final evaluation. This application is intended to help the deaf community communicate more easily with the wider public. It can improve communication accessibility for individuals with hearing impairments in public and educational settings, and it provides an innovative solution to support social inclusivity. Further testing and parameter optimization will be conducted to expand the detection coverage and improve the system's performance in the future.
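The training and evaluation workflow described above can be sketched with the Ultralytics YOLOv8 API. This is a minimal illustration only, not the authors' exact configuration: the dataset file name (bisindo.yaml), the model variant (yolov8n.pt), and the training hyperparameters are assumptions for demonstration.

```python
# Minimal sketch of a YOLOv8 training/evaluation pipeline (assumed setup,
# not the study's exact configuration).
from ultralytics import YOLO

# Load a pretrained YOLOv8 checkpoint as the starting point (assumed variant).
model = YOLO("yolov8n.pt")

# Train on an annotated BISINDO dataset described by a YOLO-format data config
# (bisindo.yaml is a hypothetical file listing the 51 hand-gesture classes).
model.train(data="bisindo.yaml", epochs=100, imgsz=640)

# Validate to obtain detection metrics comparable to those reported
# in the abstract (mAP@50 and mAP@50-95).
metrics = model.val()
print("mAP@50:", metrics.box.map50)
print("mAP@50-95:", metrics.box.map)

# Run real-time inference on a webcam stream, as an integrated web interface
# might do on the server side.
for result in model.predict(source=0, stream=True):
    print(result.boxes)  # detected hand gestures per frame
```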