This research develops an object detection system to assist visually impaired individuals in navigating dynamic environments, including roads and indoor spaces. Built using the Research and Development (R&D) method, the system employs YOLO version 10 (YOLOv10) with dual cameras and provides audio output through a speaker, detecting six object classes (person, car, motorcycle, bicycle, table, and chair) in real time. Testing covered variations in distance and lighting conditions, detection delay, and direct trials with visually impaired users. Results show an effective detection range of up to 5 meters. Under bright indoor lighting, the average error was 8.97%, while outdoor morning conditions yielded 3.95%. In low-light and dark conditions, accuracy decreased significantly, with errors ranging from 60.33% to 100%. Detection delay ranged from 4.3 to 7.4 seconds. The system achieved a Macro F1-Score of 0.74, with the highest per-class performance for cars (0.92) and the lowest for persons (0.62). Direct trials with five visually impaired participants showed an average accuracy of 92.58% and an average delay of 4.63 seconds. The system effectively delivers precise audio information, helping users recognize objects in front of and behind them, thereby enhancing safety and confidence during navigation.
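The Macro F1-Score reported above is the unweighted mean of the per-class F1 scores. A minimal sketch of that aggregation follows; note that only the car (0.92) and person (0.62) scores are given in the abstract, so the remaining four per-class values here are illustrative placeholders, not reported results.

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; defined as 0 when both are 0."""
    return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

# Per-class F1 scores for the six detected classes. Only "car" (0.92) and
# "person" (0.62) come from the abstract; the other four are hypothetical
# values chosen solely to illustrate the macro averaging.
class_f1 = {
    "person": 0.62,
    "car": 0.92,
    "motorcycle": 0.70,
    "bicycle": 0.68,
    "table": 0.75,
    "chair": 0.77,
}

# Macro F1: average the per-class F1 scores with equal weight per class,
# regardless of how many instances of each class appear in the test set.
macro_f1 = sum(class_f1.values()) / len(class_f1)
print(round(macro_f1, 2))
```

Macro averaging is a reasonable choice here because it prevents frequent classes (e.g., persons in a street scene) from dominating the score of rarer but safety-critical classes.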