Rosa Anamisa, Devie
Unknown Affiliation

Published: 1 Document
Articles

Found 1 Document

Classification of Rice Disease Using Deep Learning Object Detection Yolov8
Dwi Satoto, Budi; Rosa Anamisa, Devie; Yusuf, Muhammad; Kautsar Sophan, Mohammad; Kembang Hapsari, Rinci; Irmawati, Budi; Arrova Dewi, Deshinta
JOIV : International Journal on Informatics Visualization Vol 9, No 6 (2025)
Publisher : Society of Visual Informatics

DOI: 10.62527/joiv.9.6.3578

Abstract

Rice pests and diseases are among the primary threats to agricultural production, particularly in rice-growing regions, and can cause a significant decrease in crop yields and food supply. Accurate technology for detecting and classifying pests and diseases is therefore essential. This research proposes deep learning-based object detection applied to moving footage, since observations are made over relatively large areas of land. Drone cameras capture the fields as video, and the frames are used to create ground-truth annotations and identification targets for training. YOLOv8, a recent object detection model suited to moving media, offers advantages in speed and accuracy, making it well-suited to applications that require precise results on agricultural land. The dataset comprises videos of rice plants infested with pests and diseases. After labeling and training, the YOLOv8 model detects and classifies pests and diseases in real time, marking them with bounding boxes and identification labels. By deploying this system, farmers can identify pest and disease attacks earlier, enabling more effective and timely control measures. The results showed a training accuracy of 91.5%, an F1-confidence score of 0.84, a precision-recall curve value of 0.891, and a recall-confidence curve value of 0.97. In trials on experimental data, detection confidence ranged from 80% to 95%.
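
As an illustration of the workflow the abstract describes (annotated video frames, YOLOv8 training, and real-time detection on drone footage), the sketch below uses the Ultralytics Python API. The dataset config, class names, video path, and hyperparameters are assumptions for illustration only; the paper's actual configuration is not stated here.

```python
# Minimal sketch of a YOLOv8 train-and-detect workflow, assuming the
# Ultralytics API. Paths, class names, and hyperparameters are
# illustrative assumptions, not the authors' actual settings.
from ultralytics import YOLO

# Train on frames extracted from drone video and annotated with
# bounding boxes for each pest/disease class (YOLO-format dataset).
model = YOLO("yolov8n.pt")          # pretrained weights as a starting point
model.train(
    data="rice_disease.yaml",       # hypothetical dataset config (paths + class names)
    epochs=100,
    imgsz=640,
)

# Run detection on drone footage; each detection is a bounding box
# with a class label and a confidence score.
results = model.predict(
    source="paddy_field.mp4",       # hypothetical drone video
    conf=0.5,                       # confidence threshold (assumption)
    stream=True,                    # yield results frame by frame
)
for r in results:
    for box in r.boxes:
        cls_name = model.names[int(box.cls)]
        print(f"{cls_name}: confidence {float(box.conf):.2f}")
```

In this kind of setup, the frames drawn around detected regions and the per-class confidence scores correspond to the markers and identification labels the abstract mentions for real-time monitoring.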