This study addresses the challenge of selecting a suitable visual tracking method for real-time mobile robot applications, particularly in scenarios where the target moves on the ground. The core research problem is the need for a flexible, computationally efficient tracking method that does not rely on pre-existing labelled datasets, as deep learning approaches often do. Unsupervised methods sidestep this requirement by exploiting object motion information in each image frame without prior training. With many unsupervised tracking methods available, however, choosing an algorithm that performs efficiently under dynamic conditions remains a critical problem. The study compares the performance of three unsupervised visual tracking methods: the particle filter, optical flow, and the channel and spatial reliability tracker (CSRT) under various tracking conditions. The dataset includes challenges such as moving-target variations, changes in object scale, viewpoint changes, suboptimal lighting, image blurring, partial occlusions, and abrupt movements. Evaluation criteria include tracking accuracy, robustness to occlusion, and computational efficiency. The particle filter with ORB features and a constant-velocity model achieves a root mean square error (RMSE) of 36.47 pixels at 13 frames per second (fps). Optical flow performs best with an RMSE of 10.79 pixels at 30 fps, while CSRT shows an RMSE of 252.35 pixels at 4 fps. These findings highlight the effectiveness of optical flow for real-time applications, making it a promising solution for mobile robot visual tracking in challenging situations.
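To make the particle-filter configuration concrete, the following is a minimal sketch of a constant-velocity particle filter for 2-D target tracking. It is an illustrative assumption, not the paper's implementation: the ORB feature-matching stage is replaced by a generic stream of (x, y) observations, and all parameters (particle count, noise levels) are hypothetical.

```python
import math
import random

def track_constant_velocity(measurements, n_particles=500,
                            process_noise=2.0, meas_std=5.0, seed=0):
    """Constant-velocity particle filter over a stream of (x, y) observations.

    In the paper the observations would come from ORB feature matching;
    here they are plain points. Returns a per-frame (x, y) estimate.
    """
    rng = random.Random(seed)
    x0, y0 = measurements[0]
    # State per particle: [x, y, vx, vy], initialised near the first observation.
    particles = [[x0 + rng.gauss(0, meas_std), y0 + rng.gauss(0, meas_std),
                  rng.gauss(0, 1.0), rng.gauss(0, 1.0)]
                 for _ in range(n_particles)]
    estimates = []
    for zx, zy in measurements:
        # Predict: constant-velocity motion model plus process noise.
        for p in particles:
            p[0] += p[2] + rng.gauss(0, process_noise)
            p[1] += p[3] + rng.gauss(0, process_noise)
            p[2] += rng.gauss(0, 0.5)
            p[3] += rng.gauss(0, 0.5)
        # Update: Gaussian likelihood of the observation given each particle.
        weights = [math.exp(-((p[0] - zx) ** 2 + (p[1] - zy) ** 2)
                            / (2 * meas_std ** 2)) for p in particles]
        total = sum(weights) or 1e-12
        weights = [w / total for w in weights]
        # Estimate: weighted mean of the particle positions.
        ex = sum(w * p[0] for w, p in zip(weights, particles))
        ey = sum(w * p[1] for w, p in zip(weights, particles))
        estimates.append((ex, ey))
        # Resample (systematic) to concentrate particles on likely states.
        cumulative, c = [], 0.0
        for w in weights:
            c += w
            cumulative.append(c)
        step = 1.0 / n_particles
        offset = rng.random() * step
        resampled, i = [], 0
        for k in range(n_particles):
            while i < n_particles - 1 and cumulative[i] < offset + k * step:
                i += 1
            resampled.append(list(particles[i]))
        particles = resampled
    return estimates
```

A per-frame RMSE like the one reported in the abstract can then be computed between the returned estimates and the ground-truth trajectory.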
Copyright © 2025