In this research, a head gesture-controlled robot was designed and developed to assist individuals with disabilities in performing tasks by translating head movements into robot commands. Using an accelerometer sensor embedded in a headgear device, the system interprets specific gestures, such as forward nods for forward movement, backward nods for reversing, and lateral tilts for turning left or right, as corresponding robotic actions. The design involved constructing a mechanical framework for the robot, assembling the headgear, and integrating both with Arduino-based programming to ensure accurate and responsive movements. Testing was conducted in a controlled setting, where the robot consistently followed head gestures and responded rapidly to user inputs. Quantitative results demonstrated the system's reliability, with over 95% gesture-recognition accuracy and minimal latency. This system underscores the potential of head gesture-controlled robotics in assistive technology, offering an affordable, user-friendly solution to enhance mobility and autonomy for individuals with limited physical capabilities.
Copyright © 2024