Difficulty in communication is a major obstacle for deaf people who cannot learn language orally or acquire typical speech skills. The development of sign language gesture recognition technology is therefore an important step toward improving accessibility and social integration for the deaf community. The combination of MediaPipe Holistic keypoints and deep learning techniques offers significant potential for recognizing and understanding sign language gestures. The main objective of this study is to classify Indonesian Sign Language (BISINDO) gestures using MediaPipe Holistic keypoints and a deep learning approach in order to identify basic words in sign language. Features were extracted with MediaPipe Holistic and fed into an LSTM model with six hidden layers, trained with a 70:30 train-test split for 250 epochs, yielding an accuracy of 68%. This relatively low accuracy is attributed to the limited size of the dataset collected for the study.
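The sketch below illustrates one plausible implementation of the pipeline described above, assuming the standard MediaPipe Python API and TensorFlow/Keras. The sequence length (30 frames), layer widths, and composition of the six hidden layers (3 LSTM + 3 Dense) are illustrative assumptions, not values reported in the study.

```python
# Minimal sketch of keypoint extraction and LSTM training for gesture
# classification. Layer sizes and sequence length are assumptions.
import numpy as np
import mediapipe as mp
from sklearn.model_selection import train_test_split
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

mp_holistic = mp.solutions.holistic

def extract_keypoints(results):
    """Flatten MediaPipe Holistic landmarks (pose, face, both hands) into one vector."""
    pose = (np.array([[lm.x, lm.y, lm.z, lm.visibility]
                      for lm in results.pose_landmarks.landmark]).flatten()
            if results.pose_landmarks else np.zeros(33 * 4))
    face = (np.array([[lm.x, lm.y, lm.z]
                      for lm in results.face_landmarks.landmark]).flatten()
            if results.face_landmarks else np.zeros(468 * 3))
    lh = (np.array([[lm.x, lm.y, lm.z]
                    for lm in results.left_hand_landmarks.landmark]).flatten()
          if results.left_hand_landmarks else np.zeros(21 * 3))
    rh = (np.array([[lm.x, lm.y, lm.z]
                    for lm in results.right_hand_landmarks.landmark]).flatten()
          if results.right_hand_landmarks else np.zeros(21 * 3))
    return np.concatenate([pose, face, lh, rh])  # 1662 features per frame

def build_model(num_classes, seq_len=30, n_features=1662):
    """LSTM classifier with six hidden layers (3 LSTM + 3 Dense) and a softmax output."""
    model = Sequential([
        LSTM(64, return_sequences=True, activation='relu',
             input_shape=(seq_len, n_features)),
        LSTM(128, return_sequences=True, activation='relu'),
        LSTM(64, return_sequences=False, activation='relu'),
        Dense(64, activation='relu'),
        Dense(64, activation='relu'),
        Dense(32, activation='relu'),
        Dense(num_classes, activation='softmax'),
    ])
    model.compile(optimizer='adam', loss='categorical_crossentropy',
                  metrics=['categorical_accuracy'])
    return model

# Training with the reported 70:30 split and 250 epochs, given X (sequences of
# extracted keypoint vectors) and one-hot labels y:
# X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3)
# model = build_model(num_classes=y.shape[1])
# model.fit(X_train, y_train, epochs=250, validation_data=(X_test, y_test))
```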