Accurate weed detection is essential for maintaining the cleanliness and aesthetic appeal of residential yards. This study optimized YOLOv11n, a lightweight object detection model, to achieve high precision in weed identification under real-world conditions. The novelty of the work lies in applying Optuna, an automatic hyperparameter optimization framework, to improve model performance while preserving computational efficiency for resource-limited devices such as drones and IoT systems. The training pipeline applied data augmentation techniques including crop (0–20% zoom), hue (±20°), saturation (±30%), brightness (±20%), exposure (±15%), and mosaic augmentation. The augmented images were used to train four YOLO nano variants (v5n, v8n, v11n, v12n), which were evaluated using standard metrics: Precision, Recall, F1-score, and mean Average Precision (mAP). Among the models tested, YOLOv11n with the custom Optuna-tuned configuration delivered the highest performance, achieving a 94.6% F1-score and 97.8% mAP@0.5. These results demonstrate that the optimized YOLOv11n model can support accurate and efficient real-time weed detection in household environments, particularly on edge devices with limited hardware capabilities, making it a viable solution for practical deployment in precision agriculture and smart gardening.
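To make the optimization procedure concrete, the sketch below illustrates how Optuna can drive hyperparameter search for an Ultralytics YOLOv11n model. It is a minimal illustration rather than the exact configuration used in this study: the dataset file (weeds.yaml), the search ranges, and the trial and epoch budgets are assumptions for demonstration, and the objective maximizes mAP@0.5 on the validation split.

```python
# Minimal sketch: Optuna-driven hyperparameter search for YOLOv11n (Ultralytics).
# The dataset file, search ranges, trial count, and epoch budget are illustrative
# assumptions, not the settings reported in this study.
import optuna
from ultralytics import YOLO

def objective(trial):
    # Sample a few common training hyperparameters.
    lr0 = trial.suggest_float("lr0", 1e-4, 1e-1, log=True)   # initial learning rate
    momentum = trial.suggest_float("momentum", 0.80, 0.98)   # optimizer momentum
    weight_decay = trial.suggest_float("weight_decay", 1e-5, 1e-3, log=True)

    model = YOLO("yolo11n.pt")   # pretrained YOLOv11 nano weights
    model.train(
        data="weeds.yaml",       # hypothetical weed-detection dataset config
        epochs=50,
        imgsz=640,
        lr0=lr0,
        momentum=momentum,
        weight_decay=weight_decay,
        verbose=False,
    )
    metrics = model.val()        # evaluate on the validation split
    return metrics.box.map50     # objective value: mAP@0.5

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
print("Best hyperparameters:", study.best_params)
```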