The development of Artificial Intelligence, particularly in Computer Vision, has enabled real-time recognition of human movements such as head gestures, which can be utilized in smart wheelchairs for users with limited mobility. This study compares two lightweight, non-deep-learning methods, Lucas–Kanade Optical Flow and the Haar Cascade Classifier, for real-time head movement detection. Both methods were implemented in Python using OpenCV and tested on four basic head movement directions (left, right, up, and down) under three lighting conditions: bright, normal, and dim. Each condition consisted of 16 trials per method, resulting in a total of 96 trials. The evaluation focused on detection accuracy and decision time. Under bright lighting, Optical Flow achieved 87.5% accuracy with a decision time of 0.338–1.41 s, while Haar Cascade reached 50% accuracy with 0.616–1.20 s. Under normal lighting, Optical Flow maintained 87.5% accuracy with 0.89–1.21 s, compared to Haar Cascade’s 68.75% accuracy with 0.83–1.25 s. Under dim lighting, Optical Flow improved to 93.8% accuracy with 0.90–1.31 s, whereas Haar Cascade dropped to 62.5% accuracy with 0.89–1.58 s. These findings confirm that Optical Flow delivers more reliable and adaptive performance across varying illumination levels, making it more suitable for real-time smart wheelchair control. This study contributes to the development of affordable assistive technologies and highlights future directions for multi-user testing and hardware integration.
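For illustration, the sketch below shows how a Lucas–Kanade Optical Flow pipeline of the kind described above can classify head movement direction with OpenCV in Python. It is a minimal sketch, not the paper's implementation: the displacement threshold, feature-detection parameters, and camera index are assumptions chosen for readability rather than values reported in the study.

```python
import cv2
import numpy as np

# Hypothetical displacement threshold (pixels); the paper's actual parameters
# are not given in the abstract.
MOVE_THRESHOLD = 15.0

cap = cv2.VideoCapture(0)  # assumed default webcam
ret, prev_frame = cap.read()
prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)

# Detect corner features to track (assumed illustrative parameters).
prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                                   qualityLevel=0.3, minDistance=7)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Re-detect features if the previous set was lost.
    if prev_pts is None or len(prev_pts) == 0:
        prev_pts = cv2.goodFeaturesToTrack(gray, maxCorners=100,
                                           qualityLevel=0.3, minDistance=7)
        prev_gray = gray
        continue

    # Lucas–Kanade pyramidal optical flow between consecutive frames.
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, prev_pts, None,
        winSize=(15, 15), maxLevel=2,
        criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 0.03))

    good_new = next_pts[status == 1]
    good_old = prev_pts[status == 1]

    if len(good_new) > 0:
        # Mean displacement of tracked features approximates head motion.
        dx, dy = np.mean(good_new - good_old, axis=0)
        if abs(dx) > abs(dy) and abs(dx) > MOVE_THRESHOLD:
            direction = "right" if dx > 0 else "left"   # image x axis
        elif abs(dy) > MOVE_THRESHOLD:
            direction = "down" if dy > 0 else "up"      # image y axis points down
        else:
            direction = "none"
        print(direction)

    prev_gray = gray
    prev_pts = good_new.reshape(-1, 1, 2)

cap.release()
```

A Haar Cascade variant would instead track the center of the face bounding box returned by `cv2.CascadeClassifier.detectMultiScale` across frames and apply the same directional thresholding to its displacement.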