The advancement of computer vision has created new possibilities for human-computer interaction, especially in digital games. This study develops a hand gesture-based control system for Subway Surfers using computer vision. The system enables players to control the game character through hand gestures captured by a camera, eliminating the need for conventional input devices such as keyboards or touchscreens. The methodology involves capturing gesture data, processing images to extract gesture features, implementing a recognition model using MediaPipe Hands and a Convolutional Neural Network (CNN), and integrating the system with the game via input control emulation. Testing evaluates gesture recognition accuracy, system responsiveness, and user experience. Results show that the system achieves an average accuracy of 85% under stable lighting and a response time of 100–150 ms, which is acceptable for real-time gameplay. These findings indicate that hand gesture-based control using computer vision is feasible and effective for playing Subway Surfers. This research contributes to the development of more immersive human-computer interaction, particularly in gaming and interactive applications.
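To make the described pipeline concrete, the following is a minimal sketch of how such a system could be wired together in Python: a webcam frame is passed to MediaPipe Hands for landmark extraction, a classifier maps the landmarks to a gesture label, and a keyboard-emulation call forwards the action to the game. The gesture labels, key bindings, and the `classify_gesture` placeholder are illustrative assumptions, not the paper's actual trained CNN or gesture set.

```python
import cv2
import mediapipe as mp
import pyautogui  # keyboard emulation used here as a stand-in for the paper's input control layer

# Hypothetical mapping from gesture label to Subway Surfers action key (assumed, not from the paper).
GESTURE_TO_KEY = {
    "swipe_left": "left",
    "swipe_right": "right",
    "jump": "up",
    "roll": "down",
}

def classify_gesture(landmarks):
    """Placeholder for the paper's trained CNN classifier.

    In the described system, the 21 MediaPipe hand landmarks would be converted
    to features and classified by a CNN. Returning None keeps this sketch
    self-contained; a real implementation would call the trained model here.
    """
    return None

mp_hands = mp.solutions.hands

def run_controller():
    cap = cv2.VideoCapture(0)  # webcam serves as the gesture sensor
    with mp_hands.Hands(max_num_hands=1,
                        min_detection_confidence=0.7,
                        min_tracking_confidence=0.5) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV captures frames in BGR
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                landmarks = results.multi_hand_landmarks[0].landmark
                gesture = classify_gesture(landmarks)
                key = GESTURE_TO_KEY.get(gesture)
                if key:
                    pyautogui.press(key)  # emulate the key press expected by the game
            if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
                break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    run_controller()
```

In a setup like this, the end-to-end latency is dominated by frame capture, landmark detection, and classification, which is consistent with the reported 100–150 ms response range under stable lighting.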