Shalih, Muhammad Umar
Unknown Affiliation

Published: 2 Documents
Articles


Hand Sign Virtual Reality Data Processing Using Padding Technique
Tju, Teja Endra Eng; Anggraini, Julaiha Probo; Shalih, Muhammad Umar
Intelligent System and Computation Vol 6 No 2 (2024): INSYST: Journal of Intelligent System and Computation
Publisher : Institut Sains dan Teknologi Terpadu Surabaya (d/h Sekolah Tinggi Teknik Surabaya)

DOI: 10.52985/insyst.v6i2.395

Abstract

This study focuses on addressing the challenges of processing hand sign data in Virtual Reality environments, particularly the variability in data length during gesture recording. To optimize machine learning models for gesture recognition, various padding techniques were implemented. The data was gathered using the Meta Quest 2 device, consisting of 1,000 samples representing 10 American Sign Language hand sign movements. The research applied different padding techniques, including pre- and post-zero padding as well as replication padding, to standardize sequence lengths. Long Short-Term Memory networks were utilized for modeling, with the data split into 80% for training and 20% for validation. An additional 100 unseen samples were used for testing. Among the techniques, pre-replication padding produced the best results in terms of accuracy, precision, recall, and F1 score on the test dataset. Both pre- and post-zero padding also demonstrated strong performance but were outperformed by replication padding. This study highlights the importance of padding techniques in optimizing the accuracy and generalizability of machine learning models for hand sign recognition in Virtual Reality. The findings offer valuable insights for developing more robust and efficient gesture recognition systems in interactive Virtual Reality environments, enhancing user experiences and system reliability. Future work could explore extending these techniques to other Virtual Reality interactions.
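The padding techniques compared in the abstract can be illustrated with a minimal sketch. This is not the authors' code: the function name, the feature dimensions, and the example gesture are hypothetical, and the sketch only assumes that each recording is a (frames, features) array padded to a common length before being fed to an LSTM.

```python
# Illustrative sketch of the four padding variants named above:
# pre/post zero padding and pre/post replication padding.
import numpy as np

def pad_sequence(seq, target_len, mode="pre_zero"):
    """Pad a (frames, features) array to target_len frames.

    pre_zero / post_zero prepend or append zero frames;
    pre_repl / post_repl replicate the first or last frame instead,
    so the padding carries a plausible hand pose rather than zeros.
    """
    seq = np.asarray(seq, dtype=float)
    deficit = target_len - seq.shape[0]
    if deficit <= 0:
        return seq[:target_len]  # truncate overly long recordings
    if mode == "pre_zero":
        return np.vstack([np.zeros((deficit, seq.shape[1])), seq])
    if mode == "post_zero":
        return np.vstack([seq, np.zeros((deficit, seq.shape[1]))])
    if mode == "pre_repl":
        return np.vstack([np.repeat(seq[:1], deficit, axis=0), seq])
    if mode == "post_repl":
        return np.vstack([seq, np.repeat(seq[-1:], deficit, axis=0)])
    raise ValueError(f"unknown mode: {mode}")

# Hypothetical example: a 3-frame gesture with 2 features per frame,
# padded to 5 frames by replicating the first frame.
gesture = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
padded = pad_sequence(gesture, 5, mode="pre_repl")
```

Replication padding keeps the padded region statistically similar to real frames, which is one plausible reason it outperformed zero padding in the study's results.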
Hand Sign Interpretation through Virtual Reality Data Processing
Tju, Teja Endra Eng; Shalih, Muhammad Umar
Jurnal Ilmu Komputer dan Informasi Vol. 17 No. 2 (2024): Jurnal Ilmu Komputer dan Informasi (Journal of Computer Science and Information)
Publisher : Faculty of Computer Science - Universitas Indonesia

DOI: 10.21609/jiki.v17i2.1280

Abstract

The research lays the groundwork for further advancements in VR technology, aiming to develop devices capable of interpreting sign language into speech via intelligent systems. The uniqueness of this study lies in utilizing the Meta Quest 2 VR device to gather primary hand sign data, subsequently classified using Machine Learning techniques to evaluate the device's proficiency in interpreting hand signs. The initial stages emphasized collecting hand sign data from VR devices and processing the data to comprehend sign patterns and characteristics effectively. In total, 1021 data points, comprising ten distinct hand sign gestures, were collected using a simple application developed with Unity Editor. Each data point contained 14 parameters from both hands, aligned with the headset so that body rotation would not distort hand movements and the user's facing direction was accurately reflected. The data processing involved padding techniques to standardize varied data lengths resulting from diverse recording periods. The Interpretation Algorithm Development involved Recurrent Neural Networks tailored to data characteristics. Evaluation metrics encompassed Accuracy, Validation Accuracy, Loss, Validation Loss, and Confusion Matrix. Over 15 epochs, validation accuracy notably stabilized at 0.9951, showcasing consistent performance on unseen data. The implications of this research serve as a foundation for further studies in the development of VR devices or other wearable gadgets that can function as sign language interpreters.
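The evaluation metrics mentioned in the abstract (accuracy and a confusion matrix for a ten-class gesture classifier) can be sketched as follows. This is not the authors' evaluation code; the labels below are hypothetical and the class count is reduced to three for brevity.

```python
# Illustrative sketch: accuracy and confusion matrix for a multi-class
# hand-sign classifier, as used to evaluate the model described above.
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Rows index the true class, columns the predicted class."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

def accuracy(cm):
    """Fraction of samples on the diagonal (correct predictions)."""
    return np.trace(cm) / cm.sum()

# Hypothetical predictions over five samples of classes 0-2:
# one class-2 sample is misclassified as class 1.
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 2, 1, 1]
cm = confusion_matrix(y_true, y_pred, n_classes=3)
```

Reading the off-diagonal cells of the confusion matrix shows which gesture pairs the model confuses, which is more informative than the aggregate accuracy alone.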