This study presents an approach to enhancing apple leaf disease detection with deep learning by comparing three models: ReXNet-150, EfficientNet, and a conventional CNN (ResNet-18). The objective is to identify the most accurate and efficient model for real-world deployment in resource-constrained environments. Using a dataset of 1,730 high-quality images, the models were trained with transfer learning and achieved strong classification results. ReXNet-150 outperformed the other models with an F1-score of 0.988, precision of 0.989, and recall of 0.989, while EfficientNet and ResNet-18 delivered competitive performance with F1-scores of 0.966 and 0.977, respectively. Integrating the ReXNet-150 model into a TensorFlow Lite-based Android application enables real-time detection, allowing farmers and researchers to capture or upload images for immediate classification. The findings highlight ReXNet-150's robustness: it achieved a test accuracy of 98.9% with minimal misclassification, making it well suited to practical agricultural applications. The novelty lies in bridging advanced deep learning with mobile deployment under real-world constraints. Future work could extend this framework to multi-crop disease detection and real-time video analysis, providing scalable solutions for precision agriculture.
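To make the transfer-learning step concrete, the sketch below fine-tunes an ImageNet-pretrained ReXNet-150 on an apple leaf dataset. The abstract does not specify implementation details, so the use of timm's rexnet_150 checkpoint, a four-class label set, the data path, and the training hyperparameters are all assumptions for illustration, not the authors' exact pipeline.

```python
# Minimal transfer-learning sketch (assumptions: timm's rexnet_150 checkpoint,
# 4 disease classes, an ImageFolder layout -- none are given in the abstract).
import timm
import torch
from torch import nn, optim
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

NUM_CLASSES = 4  # hypothetical class count; adjust to the actual label set

# Load an ImageNet-pretrained ReXNet-150 and replace the classifier head.
model = timm.create_model("rexnet_150", pretrained=True, num_classes=NUM_CLASSES)

# Standard ImageNet preprocessing; the paper's exact augmentation is not stated.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
train_ds = datasets.ImageFolder("data/apple_leaves/train", transform=preprocess)  # hypothetical path
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True, num_workers=2)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
criterion = nn.CrossEntropyLoss()
optimizer = optim.AdamW(model.parameters(), lr=1e-4)

# Fine-tune all layers; a head-only warm-up phase is another common choice.
for epoch in range(10):
    model.train()
    for images, labels in train_dl:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss={loss.item():.4f}")

# On-device inference would still require converting the trained weights to
# TensorFlow Lite (e.g. via an ONNX intermediate); that export step is not shown.
torch.save(model.state_dict(), "rexnet150_apple_leaf.pt")
```

The same loop applies to the EfficientNet and ResNet-18 baselines by swapping the model name, which is what makes the three-way comparison straightforward under transfer learning.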