Human-computer interaction continues to evolve, and one emerging direction is device control through hand gestures. In presentation settings, relying on a mouse as an assistive tool often restricts the presenter's mobility. This study therefore develops a hand gesture detection system that controls the cursor in real time using the OpenCV and MediaPipe libraries, allowing presenters to move freely in front of the audience without being tied to a physical input device. System development followed the Rational Unified Process (RUP) methodology, covering its four phases: Inception, Elaboration, Construction, and Transition. The system is implemented in the Python programming language, with the Autopy library handling cursor control. Testing was conducted under lighting conditions of 100 lux and at a distance of approximately 1 to 1.5 meters. The results demonstrate strong performance: single-click functionality achieved 98% accuracy and 98% precision; double-click reached 99% accuracy and 100% precision; right-click showed 98.04% accuracy and 96.15% precision; and cursor movement achieved 100% accuracy and 100% precision. The system detects hand gestures and controls the cursor with high accuracy and fast response, and it supports light multitasking activities such as opening various types of files. This research contributes to advancing human-computer interaction without the need for traditional input devices.
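The core of such a pipeline is mapping a detected fingertip landmark to screen coordinates. MediaPipe reports hand landmarks as normalized coordinates in [0, 1], which must be rescaled to the screen resolution (e.g. for `autopy.mouse.move`) and smoothed to suppress jitter. The sketch below illustrates this mapping step only; the function names, the 0.1 frame margin, and the smoothing factor are illustrative assumptions, not the authors' published code.

```python
# Hypothetical sketch of the landmark-to-cursor mapping step.
# MediaPipe yields normalized landmark coordinates in [0, 1];
# the cursor controller rescales them to screen pixels and
# smooths successive positions to reduce jitter.

def to_screen(norm_x, norm_y, screen_w, screen_h, margin=0.1):
    """Map a normalized fingertip coordinate to screen pixels.

    An inner margin of the camera frame is clipped away so the
    finger can reach the screen edges without leaving the frame.
    """
    span = 1.0 - 2.0 * margin
    # Rescale [margin, 1 - margin] -> [0, 1], clamping at the edges.
    x = min(max((norm_x - margin) / span, 0.0), 1.0)
    y = min(max((norm_y - margin) / span, 0.0), 1.0)
    return x * screen_w, y * screen_h

def smooth(prev, target, factor=5.0):
    """Exponential smoothing: step a fraction of the way toward target."""
    return prev + (target - prev) / factor

# Example: index fingertip (MediaPipe landmark 8) at the frame
# centre, mapped onto a 1920x1080 screen.
x, y = to_screen(0.5, 0.5, 1920, 1080)
# In the full system, autopy.mouse.move(x, y) would then place the cursor.
```

In a complete loop, each frame would be read with OpenCV, passed to MediaPipe Hands for landmark detection, and the smoothed fingertip position forwarded to Autopy; click gestures would be recognized separately (e.g. from fingertip distances) before invoking a mouse click.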
Copyright © 2025