Articles

Found 2 Documents

Electric wheelchair navigation based on hand gestures prediction using the k-Nearest Neighbor method Anam, Khairul; Nahela, Safri; Sasono, Muchamad Arif Hana; Rizal, Naufal Ainur; Putra, Aviq Nurdiansyah; Wahono, Bambang; Putrasari, Yanuandri; Wardana, Muhammad Khristamto Aditya; Salim, Taufik Ibnu
Journal of Mechatronics, Electrical Power, and Vehicular Technology Vol 16, No 1 (2025)
Publisher : National Research and Innovation Agency

DOI: 10.55981/j.mev.2025.1229

Abstract

The advancement of technology in the medical field has led to innovations in assistive devices, including wheelchairs, to enhance the mobility and independence of individuals with disabilities. This study investigates the use of electromyography (EMG) signals from hand muscles to control a wheelchair using the k-Nearest Neighbor (kNN) classification method. kNN is a classification algorithm that identifies objects based on the proximity of similar objects in the feature space. The wheelchair control process begins with the development of a kNN model trained on EMG signal data collected from five respondents over 30 seconds. The data was processed using feature extraction techniques, namely Mean Absolute Value (MAV) and Root Mean Square (RMS), to identify motion characteristics corresponding to five types of movement: forward, backward, right, left, and stop. The extracted features were classified using the kNN algorithm implemented on a Raspberry Pi 3. The classification results were then used to control the wheelchair through an Arduino UNO microcontroller connected to a BTS7960 motor driver. The study achieved an average accuracy of 96% with the MAV feature and k = 3. Furthermore, combining MAV and RMS features significantly improved classification accuracy. The highest accuracy was obtained using the combination of MAV and RMS features with k = 3, demonstrating the effectiveness of feature selection and parameter tuning in enhancing the system's performance.
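
The classification pipeline summarized above (windowed EMG, MAV and RMS features, kNN with k = 3) can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' code: the window shape, channel count, and the use of scikit-learn are assumptions.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def extract_features(window):
    # window: (n_samples, n_channels) EMG segment -> MAV and RMS per channel
    mav = np.mean(np.abs(window), axis=0)        # Mean Absolute Value
    rms = np.sqrt(np.mean(window ** 2, axis=0))  # Root Mean Square
    return np.concatenate([mav, rms])

def train_knn(windows, labels, k=3):
    # labels: "forward", "backward", "right", "left", "stop"
    X = np.vstack([extract_features(w) for w in windows])
    clf = KNeighborsClassifier(n_neighbors=k)
    clf.fit(X, labels)
    return clf

def predict_command(clf, window):
    return clf.predict(extract_features(window).reshape(1, -1))[0]

In the reported setup, the predicted label is then passed from the Raspberry Pi 3 to the Arduino UNO driving the BTS7960 motor driver; the abstract does not specify the link between the two boards, but a simple serial write of the label is one plausible bridge.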
Myoelectric grip force prediction using deep learning for hand robot Anam, Khairul; Ardhiansyah, Dheny Dwi; Hana Sasono, Muchamad Arif; Nanda Imron, Arizal Mujibtamala; Rizal, Naufal Ainur; Ramadhan, Mochamad Edoward; Muttaqin, Aris Zainul; Castellini, Claudio; Sumardi, Sumardi
IAES International Journal of Artificial Intelligence (IJ-AI) Vol 14, No 4: August 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijai.v14.i4.pp3228-3240

Abstract

Artificial intelligence (AI) has been widely applied in the medical world. One such application is a hand-driven robot based on user intention prediction. The purpose of this research is to control the grip strength of a robot according to the user's intention by predicting the user's grip strength with deep learning and electromyographic signals. The grip strength of the target hand is obtained from a handgrip dynamometer paired with electromyographic signals as training data. We evaluated a convolutional neural network (CNN) with two different architectures. The inputs to the CNN were the root mean square (RMS) and mean absolute value (MAV) features. The grip strength measured by the hand dynamometer was used as a reference value for a low-level controller of the robotic hand. The experimental results show that the CNN succeeded in predicting hand grip strength and controlling grip strength with a root mean square error (RMSE) of 2.35 N using the RMS feature. A comparison with a state-of-the-art regression method also shows that the CNN predicts grip strength better.
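
The regression setup described here (EMG RMS/MAV features in, grip force in newtons out, trained against a handgrip dynamometer and scored by RMSE) can likewise be sketched briefly. The layer sizes, window length, feature count, and Keras API below are assumptions; the abstract does not detail the two CNN architectures that were compared.

import tensorflow as tf

WINDOW = 50      # assumed number of feature frames per input window
N_FEATURES = 4   # assumed: RMS + MAV for two EMG channels

def build_grip_force_cnn():
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(WINDOW, N_FEATURES)),
        tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
        tf.keras.layers.Conv1D(64, kernel_size=3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1),  # predicted grip force in newtons
    ])
    # trained on mean squared error; RMSE reported as the evaluation metric
    model.compile(optimizer="adam", loss="mse",
                  metrics=[tf.keras.metrics.RootMeanSquaredError()])
    return model

In this sketch the dynamometer reading serves as the regression target, and the predicted force would in turn act as the reference for the low-level controller of the robotic hand, as the abstract describes.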