Human communication relies largely on speech. This is not accessible to deaf people, who depend on sign language for daily interactions; unfortunately, not everyone can understand sign language. In higher-education settings, the scarcity of people proficient in sign language often puts deaf students at a disadvantage in the learning process. One way to foster a more inclusive environment is to implement a sign language translation system. This study therefore aimed to develop a machine learning model capable of detecting and translating Indonesian Sign Language (BISINDO) alphabet gestures. The model was built with the Xception architecture using transfer learning, a Convolutional Neural Network (CNN) approach. The dataset comprised the 26 BISINDO alphabet gestures, totaling 650 images. The model was evaluated with K-Fold cross-validation and achieved an F1-score of 94% on the test data.
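The abstract reports a macro-style F1-score over the 26 gesture classes. As a minimal sketch (not the authors' actual evaluation code), the metric can be computed from per-class true positives, false positives, and false negatives; the function name `macro_f1` and the toy labels below are illustrative assumptions:

```python
def macro_f1(y_true, y_pred, labels):
    """Macro-averaged F1: compute F1 per class, then take the unweighted mean.

    This mirrors the kind of score reported in the abstract; it is a generic
    sketch, not the study's own implementation.
    """
    scores = []
    for c in labels:
        # Count per-class outcomes for class c.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if (precision + recall) else 0.0)
        scores.append(f1)
    return sum(scores) / len(scores)


# Toy example with two hypothetical gesture labels:
score = macro_f1(["A", "A", "B", "B"], ["A", "B", "B", "B"], ["A", "B"])
```

In the study's setting, `labels` would be the 26 BISINDO alphabet classes and the predictions would come from the trained Xception model on each cross-validation fold.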
Copyright © 2025