This study proposes an EMG-based hand gesture recognition method for adaptive prosthesis applications using interpretable deep learning. The method uses electromyography (EMG) signals recorded from arm muscles to identify hand movements performed by prosthesis users, and a deep learning architecture classifies those movements with high accuracy. The approach also integrates model interpretability through saliency map visualization, which reveals the key input features the network relies on when making decisions. EMG data collected from several subjects were used to train the model to recognize hand gestures such as gripping, grasping, and waving, with signal processing applied to reduce noise and improve data quality. Evaluation results show that the proposed model achieves a classification accuracy of up to 95% with a relatively low time-to-decision, making it suitable for prosthesis applications that require fast, accurate responses. These results have the potential to improve prosthesis performance through smoother, more responsive control and to offer new insights into the development of biomedical-signal-based prosthetic devices.
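The saliency maps mentioned above are typically gradient-based: the magnitude of the gradient of the predicted class score with respect to each input sample indicates how much that sample influenced the decision. The following is a minimal sketch of that idea, not the paper's implementation: it uses a linear classifier as a stand-in for the deep model, synthetic data in place of real EMG recordings, and assumed shapes (8 channels, 200-sample windows, 3 gesture classes) that are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed, illustrative shapes: an 8-channel EMG window of 200 samples,
# classified into 3 gesture classes (e.g. grip, grasp, wave).
n_channels, n_samples, n_classes = 8, 200, 3
x = rng.standard_normal((n_channels, n_samples))  # one preprocessed EMG window
W = rng.standard_normal((n_classes, n_channels * n_samples)) * 0.01
b = np.zeros(n_classes)

def predict(window):
    """Linear stand-in for the deep model; returns one score per gesture class."""
    return W @ window.ravel() + b

def saliency(window, cls):
    """|d score_cls / d input|, reshaped to the input's layout.

    For this linear stand-in the gradient is simply the weight row for the
    class; for a real deep network it would be obtained by backpropagation.
    """
    return np.abs(W[cls]).reshape(window.shape)

scores = predict(x)
pred = int(np.argmax(scores))
sal = saliency(x, pred)  # same shape as the EMG window: (8, 200)
```

Visualizing `sal` as a channel-by-time heatmap then shows which muscles and which moments in the window drove the classification.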
Copyright © 2024