People with speech and hearing disabilities often struggle to communicate with non-disabled people because they rely on sign language, which few non-disabled people learn. Addressing this problem requires a deep learning model that can detect sign language hand gestures; such a model could then power a sign language translator application that eases communication between people with disabilities and those without. The purpose of this research is to build a deep learning model that detects SIBI (Indonesian Sign Language System) alphabet hand gestures with good accuracy. The model is a convolutional neural network (CNN) based on the MobileNetV2 architecture and trained with transfer learning. The results of this study show that the model reaches 95.45% accuracy in evaluation, so it can be applied in a sign language translator application. For further development, a larger dataset is recommended so that the model is exposed to more variation during training.
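A minimal sketch of the transfer-learning setup the abstract describes, assuming a TensorFlow/Keras workflow: MobileNetV2 pre-trained on ImageNet is used as a frozen feature extractor, and a new classification head is trained for the SIBI alphabet. The 26-class count, 224x224 input size, and all hyperparameters are illustrative assumptions, not the authors' exact configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 26          # SIBI alphabet A-Z (assumed class count)
IMG_SIZE = (224, 224)     # MobileNetV2's default input resolution

# Load MobileNetV2 pre-trained on ImageNet, without its classification head.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,),
    include_top=False,
    weights="imagenet",
)
base.trainable = False    # freeze the backbone: classic transfer learning

model = models.Sequential([
    layers.Input(shape=IMG_SIZE + (3,)),
    # Scale raw pixel values to the [-1, 1] range MobileNetV2 expects.
    layers.Rescaling(1.0 / 127.5, offset=-1.0),
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```

With a directory of labeled hand-gesture images, the head can then be trained with `model.fit(...)` on a `tf.data` pipeline; only the small dense head is updated, which is what makes transfer learning practical on a modest dataset.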