This study proposes a deep learning model based on the MobileNetV2 architecture for the classification of chili leaf diseases from image data. The dataset was compiled from both public and private sources and covers six distinct categories of chili leaf conditions. MobileNetV2 was selected for its balance of efficiency and accuracy, which makes it well suited to real-time agricultural applications. The base architecture was extended with additional layers to improve feature extraction and classification performance. Stratified 10-fold cross-validation was employed so that every fold preserved the class proportions of the imbalanced dataset, ensuring a balanced evaluation. The experiments yielded an overall accuracy of 91.04% and an average F1-score of 0.906, indicating consistent and reliable classification performance across classes. Confusion matrix analysis showed strong predictive capability, particularly in detecting healthy leaves and severe disease symptoms, with minor misclassifications among visually similar categories. The findings confirm the potential of lightweight CNN architectures for practical, mobile-based agricultural diagnostics, contributing to advancements in precision farming and early disease management.
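The stratified k-fold protocol mentioned above can be illustrated with a minimal pure-Python sketch. This is not the paper's evaluation code; the class names and counts below are hypothetical stand-ins for the six-category chili leaf dataset, and the round-robin assignment is one simple way to preserve class proportions in every fold:

```python
from collections import defaultdict

def stratified_kfold(labels, k=10):
    """Split sample indices into k folds, preserving per-class proportions.

    Indices of each class are dealt round-robin across the folds, so even a
    minority class contributes to every fold whenever it has >= k samples.
    """
    by_class = defaultdict(list)
    for idx, label in enumerate(labels):
        by_class[label].append(idx)

    folds = [[] for _ in range(k)]
    for indices in by_class.values():
        for i, idx in enumerate(indices):
            folds[i % k].append(idx)
    return folds

# Hypothetical imbalanced label set: 70 "healthy" vs. 30 "leaf_curl" samples.
labels = ["healthy"] * 70 + ["leaf_curl"] * 30
folds = stratified_kfold(labels, k=10)

# Every fold keeps the 70:30 ratio: 7 "healthy" and 3 "leaf_curl" samples each,
# so per-fold metrics such as the F1-score are computed on comparable splits.
for fold in folds:
    healthy = sum(1 for i in fold if labels[i] == "healthy")
    curled = sum(1 for i in fold if labels[i] == "leaf_curl")
    print(f"fold: {healthy} healthy, {curled} leaf_curl")
```

In practice a library routine such as scikit-learn's `StratifiedKFold` (with shuffling) would typically be used instead, but the invariant it enforces is the same: each fold mirrors the overall class distribution, which is what makes the reported per-class F1-scores meaningful on an imbalanced dataset.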