Inclusive education for deaf students requires a technology-supported approach to address communication and comprehension challenges. This study aims to develop innovative learning media that integrates real-time automatic speech recognition (ASR) transcription with sign language character animation to improve accessibility and comprehension of learning materials for deaf students. The learning media accepts three types of input: live speech, audio from learning videos, and text entered by teachers. A Google Cloud API-based ASR transcription module converts speech into written text, and the resulting text, together with teacher-entered text, is broken down into vocabulary items used to look up sign language animations. The lookup is performed with an interpolation algorithm over the sign language animation asset database, allowing animations relevant to the transcribed vocabulary to be displayed. The development process follows the ADDIE instructional design model, beginning with needs analysis and ending with implementation and evaluation. The analysis stage includes data collection through teacher interviews, classroom observations, and curriculum reviews. The media is designed to meet the specific needs of deaf students, while the development and implementation stages focus on technology integration and effective delivery of material. Evaluation assesses the effectiveness of the media in improving student understanding and participation. The results show that the learning media improves deaf students' understanding of the material and increases their involvement in the learning process. ASR technology and sign language animation contribute significantly to making learning materials more accessible and understandable, supporting the goals of inclusive education.
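The paper does not include implementation code, but the pipeline described above can be sketched in Python under a few stated assumptions: the google-cloud-speech client library for the Google Cloud API-based transcription module, 16 kHz LINEAR16 audio capture with an Indonesian (id-ID) language code, and an in-memory list of (numeric key, word, animation file) entries sorted by key standing in for the sign language animation asset database. The sketch also reads the abstract's "interpolation algorithm" as classic interpolation search over that sorted table; the function names (transcribe, to_vocabulary, find_animation, animations_for_audio) are hypothetical.

```python
import re
from google.cloud import speech


def transcribe(audio_bytes: bytes, language_code: str = "id-ID") -> str:
    """Send captured audio to Google Cloud Speech-to-Text and return the transcript.

    A real-time deployment would use the streaming API; a single synchronous
    recognize() call keeps this sketch short. The 16 kHz LINEAR16 format is an
    assumption about how the audio is captured.
    """
    client = speech.SpeechClient()
    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
        sample_rate_hertz=16000,
        language_code=language_code,
    )
    audio = speech.RecognitionAudio(content=audio_bytes)
    response = client.recognize(config=config, audio=audio)
    return " ".join(r.alternatives[0].transcript for r in response.results)


def to_vocabulary(text: str) -> list[str]:
    """Lower-case the transcript (or teacher-entered text) and split it into words."""
    return re.findall(r"\w+", text.lower())


def _key(word: str, width: int = 12) -> int:
    """Map a word to an integer that preserves the lexicographic order of its
    first `width` bytes, so interpolation search can operate on numeric keys."""
    prefix = word.lower().encode("utf-8")[:width].ljust(width, b"\x00")
    return int.from_bytes(prefix, "big")


def find_animation(entries: list[tuple[int, str, str]], word: str) -> str | None:
    """Interpolation search over entries sorted by key: (key, word, animation file)."""
    target = _key(word)
    lo, hi = 0, len(entries) - 1
    while lo <= hi and entries[lo][0] <= target <= entries[hi][0]:
        if entries[hi][0] == entries[lo][0]:
            pos = lo  # equal keys at both ends: avoid division by zero
        else:
            pos = lo + (target - entries[lo][0]) * (hi - lo) // (
                entries[hi][0] - entries[lo][0]
            )
        if entries[pos][0] == target:
            # Words sharing a 12-byte prefix collide on the key, so scan the
            # contiguous run of equal keys for the exact word.
            i = pos
            while i >= 0 and entries[i][0] == target:
                if entries[i][1] == word:
                    return entries[i][2]
                i -= 1
            i = pos + 1
            while i < len(entries) and entries[i][0] == target:
                if entries[i][1] == word:
                    return entries[i][2]
                i += 1
            return None
        if entries[pos][0] < target:
            lo = pos + 1
        else:
            hi = pos - 1
    return None


def animations_for_audio(audio_bytes: bytes,
                         entries: list[tuple[int, str, str]]) -> list[str]:
    """Full pipeline: transcribe, tokenize, then look up an animation per word."""
    transcript = transcribe(audio_bytes)
    return [path for w in to_vocabulary(transcript)
            if (path := find_animation(entries, w)) is not None]
```

In the described system, teacher-entered text would skip the transcription step and feed to_vocabulary directly, and each returned animation asset would be rendered by the sign language character animation module.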