Individuals with severe motor impairments face challenges in performing daily manipulation tasks independently. Existing assistive robotic systems offer limited accuracy (typically 85–92%) and unintuitive control, requiring extensive user training. This study presents a control system that integrates electrooculography (EOG) signals with real-time computer-vision feedback for natural, high-precision control of a 4-degrees-of-freedom (4-DOF) robotic manipulator in assistive applications. The system uses an optimized K-Nearest Neighbors (KNN) algorithm to classify six eye-movement categories with high computational efficiency and real-time performance. Computer-vision modules localize object coordinates and provide feedback that is integrated with inverse kinematics for end-effector positioning. Validation with 10 able-bodied participants (aged 18–22) employed standardized protocols under controlled laboratory conditions. The KNN classifier achieved 98.17% accuracy, with a 98.47% true-positive rate and a 1.53% false-negative rate. Distance-measurement error averaged 1.5 mm (±1.6 mm). Inverse-kinematics positioning attained sub-millimeter precision for frontal retrieval, with a mean absolute error (MAE) of 0.64 mm, and 1.58 mm MAE for overhead retrieval. Operational success rates reached 99.48% for frontal and 97.96% for overhead retrieval tasks. The system successfully completed object detection, retrieval, transport, and placement across ten locations. These findings represent a substantial advance in EOG-based assistive robotics, achieving higher accuracy than conventional systems while maintaining intuitive user control. The integrated approach shows promise for rehabilitation centers and assistive environments, though further validation under diverse conditions, including latency and user fatigue, is needed.
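To make the classification stage concrete, the following is a minimal sketch of a six-class EOG gesture classifier in the style described above, using scikit-learn's KNeighborsClassifier. The feature set, window shape, class labels, and k value are illustrative assumptions; the paper's exact preprocessing and optimization pipeline are not specified in the abstract.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Assumed class labels for the six eye-movement categories.
CLASSES = ["left", "right", "up", "down", "blink", "rest"]

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple time-domain features from a 2-channel EOG window
    of shape (n_samples, 2); the feature choice is an assumption."""
    return np.concatenate([
        window.mean(axis=0),                          # mean amplitude
        window.std(axis=0),                           # variability
        window.max(axis=0) - window.min(axis=0),      # peak-to-peak range
        np.abs(np.diff(window, axis=0)).sum(axis=0),  # waveform length
    ])

def train_classifier(X_raw: np.ndarray, y: np.ndarray):
    """X_raw: (n_trials, n_samples, 2) EOG windows;
    y: integer labels indexing into CLASSES."""
    X = np.array([extract_features(w) for w in X_raw])
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=0)
    # Standardize features, then classify with KNN (k=5 assumed).
    clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
    clf.fit(X_tr, y_tr)
    print(f"held-out accuracy: {clf.score(X_te, y_te):.4f}")
    return clf
```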
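The vision stage can be summarized by a standard pinhole back-projection: a detected object's pixel coordinates, combined with a depth estimate, yield a 3-D position in the camera frame. The abstract only states that the vision modules provide object coordinates and distance feedback, so the intrinsics and depth source below are assumptions for illustration.

```python
def pixel_to_camera_frame(u: float, v: float, depth: float,
                          fx: float, fy: float, cx: float, cy: float):
    """Back-project pixel (u, v) with a measured depth (metres) to a
    3-D point in the camera frame via the pinhole model; (fx, fy) are
    focal lengths in pixels and (cx, cy) the principal point (assumed
    calibration values)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return x, y, depth
```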
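For the positioning stage, a closed-form inverse-kinematics solution is feasible for a common 4-DOF layout (base yaw plus shoulder, elbow, and wrist pitch acting in a vertical plane). The abstract does not specify the arm's geometry, so the joint arrangement and link lengths below are assumptions, not the paper's actual kinematic model.

```python
import math

L1, L2, L3 = 0.105, 0.100, 0.090  # assumed link lengths in metres

def ik_4dof(x: float, y: float, z: float, pitch: float):
    """Return (base, shoulder, elbow, wrist) angles in radians for a
    target (x, y, z) and end-effector pitch, or None if unreachable."""
    base = math.atan2(y, x)              # rotate base toward the target
    r = math.hypot(x, y)                 # radial distance in the base plane
    # Wrist centre: step back from the target along the approach direction.
    wr = r - L3 * math.cos(pitch)
    wz = z - L3 * math.sin(pitch)
    d2 = wr * wr + wz * wz
    c2 = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)  # law of cosines
    if abs(c2) > 1.0:
        return None                      # target outside the workspace
    elbow = -math.acos(c2)               # elbow-up branch chosen here
    shoulder = math.atan2(wz, wr) - math.atan2(
        L2 * math.sin(elbow), L1 + L2 * math.cos(elbow))
    wrist = pitch - shoulder - elbow     # satisfy the requested pitch
    return base, shoulder, elbow, wrist

# Example: a frontal target 15 cm ahead, 5 cm up, level approach.
angles = ik_4dof(0.15, 0.0, 0.05, 0.0)
```

A geometric solution of this kind runs in constant time, which is consistent with the real-time feedback loop the system requires; an elbow-down branch would use the positive acos root instead.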