Sreekala, Keshetti
Unknown Affiliation

Published: 2 Documents

Articles
Plant disease classification using novel integration of deep learning CNN and graph convolutional networks
Maheswara Rao, Saka Uma; Sreekala, Keshetti; Rao, Pulluri Srinivas; Shirisha, Nalla; Srinivas, Gunnam; Sreedevi, Erry
Indonesian Journal of Electrical Engineering and Computer Science Vol 36, No 3: December 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v36.i3.pp1721-1730

Abstract

Plant diseases present substantial challenges to global agriculture, significantly affecting crop yields and jeopardizing food security. Accurate and timely detection of these diseases is paramount for mitigating their adverse effects. This paper proposes a novel approach to plant disease classification that integrates convolutional neural networks (CNNs) and graph convolutional networks (GCNs). The model aims to enhance classification accuracy by leveraging both the visual features extracted by CNNs and the relational information captured by GCNs. The study uses a Kaggle dataset containing images of diseased and healthy plant leaves from 31 classes, covering apple, corn, grape, peach, bell pepper, potato, strawberry, and tomato. Standalone CNN models were trained on the image data for each plant type, while standalone GCN models used graph-structured data representing plant relationships within each subset. The proposed integrated CNN-GCN model capitalizes on the complementary strengths of CNNs and GCNs to achieve improved classification performance. Through rigorous experimentation and comparative analysis, the effectiveness of the integrated CNN-GCN approach was evaluated against standalone CNN and GCN models across all plant types. Results demonstrated the superiority of the integrated model, highlighting its potential for improving plant disease classification accuracy.
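
The abstract does not specify how the CNN and GCN branches are combined; the sketch below is a minimal, hypothetical PyTorch illustration of one plausible fusion, in which per-image CNN features become node features on a plant-relationship graph and a single graph convolution refines them before classification. The layer sizes, the adjacency construction, and the class count of 31 are assumptions for illustration, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGCNLayer(nn.Module):
    """One graph convolution: H' = A_norm @ (H W), with A_norm a normalized adjacency."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h, adj_norm):
        # adj_norm: (N, N) adjacency with self-loops, assumed pre-normalized
        return adj_norm @ self.linear(h)

class CNNGCNClassifier(nn.Module):
    """Hypothetical CNN-GCN fusion: per-image CNN features are treated as node
    features of a plant-relationship graph and refined by a GCN layer."""
    def __init__(self, num_classes=31, feat_dim=128):
        super().__init__()
        self.cnn = nn.Sequential(  # visual feature extractor (illustrative sizes)
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(), nn.Linear(64, feat_dim),
        )
        self.gcn = SimpleGCNLayer(feat_dim, feat_dim)
        self.head = nn.Linear(feat_dim, num_classes)

    def forward(self, images, adj_norm):
        h = self.cnn(images)               # (N, feat_dim) visual features per image
        h = F.relu(self.gcn(h, adj_norm))  # relational refinement across related plants
        return self.head(h)                # (N, num_classes) class logits

# Toy usage: 8 leaf images with a placeholder (identity) relationship graph.
imgs = torch.randn(8, 3, 64, 64)
adj = torch.eye(8)
print(CNNGCNClassifier()(imgs, adj).shape)  # torch.Size([8, 31])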
A hybrid convolutional neural network-recurrent neural network approach for breast cancer detection through Mask R-CNN and ARI-TFMOA optimization
Sreekala, Keshetti; Yalamati, Srilatha; Lakshmanarao, Annemneedi; Kumari, Gubbala; Kumari, Tanapaneni Muni; Desanamukula, Venkata Subbaiah
International Journal of Electrical and Computer Engineering (IJECE) Vol 15, No 3: June 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijece.v15i3.pp3084-3094

Abstract

This paper presents a novel hybrid deep learning-based approach for breast cancer detection, addressing critical challenges such as overfitting and performance degradation under varying data conditions. Unlike traditional methods that struggle with detection accuracy, this work integrates a unique combination of advanced segmentation and classification techniques. The segmentation phase leverages Mask region-based convolutional neural network (R-CNN), enhanced by the adaptive random increment-based tomtit flock metaheuristic optimization algorithm (ARI-TFMOA), a novel algorithm inspired by natural flocking behavior. ARI-TFMOA fine-tunes Mask R-CNN parameters, achieving improved feature extraction and segmentation precision while ensuring adaptability to diverse datasets. For classification, a hybrid convolutional neural network-recurrent neural network (CNN-RNN) model is introduced, combining spatial feature extraction by CNNs with temporal pattern recognition by RNNs, resulting in a more nuanced and comprehensive analysis of breast cancer images. The proposed framework demonstrated improved performance over existing methods. This hybrid integration of ARI-TFMOA and the CNN-RNN model represents a unique contribution, enabling robust, accurate, and efficient breast cancer detection.
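
The abstract describes a pipeline of ARI-TFMOA-tuned Mask R-CNN segmentation followed by a hybrid CNN-RNN classifier but gives no implementation detail; the sketch below is a minimal, hypothetical PyTorch illustration of the classification stage only, in which a small CNN encodes a segmented patch and a GRU reads the feature-map columns as a sequence. The patch size, layer widths, and two-class output are assumptions; the Mask R-CNN segmentation and ARI-TFMOA optimization are not modeled here.

import torch
import torch.nn as nn

class HybridCNNRNN(nn.Module):
    """Hypothetical hybrid classifier: a CNN encodes a segmented breast-image patch,
    a GRU reads the feature-map columns as a sequence, and the final hidden state
    is classified (e.g., benign vs. malignant)."""
    def __init__(self, num_classes=2, hidden=128):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # (32, 32, 32)
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # (64, 16, 16)
        )
        self.rnn = nn.GRU(input_size=64 * 16, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x):                    # x: (B, 1, 64, 64) grayscale patches
        f = self.cnn(x)                      # (B, 64, 16, 16) spatial features
        B, C, H, W = f.shape
        seq = f.permute(0, 3, 1, 2).reshape(B, W, C * H)  # width axis as time steps
        _, h_n = self.rnn(seq)               # sequential pattern over feature columns
        return self.head(h_n.squeeze(0))     # (B, num_classes) class logits

# Toy usage with placeholder segmented patches.
patches = torch.randn(4, 1, 64, 64)
print(HybridCNNRNN()(patches).shape)         # torch.Size([4, 2])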