Articles

Found 3 Documents

Deep ensemble architectures with heterogeneous approach for an efficient content-based image retrieval
Asokaraj, Manimegalai; Kumar, Josephine Prem; Ashwin, Nanda
IAES International Journal of Artificial Intelligence (IJ-AI) Vol 13, No 4: December 2024
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijai.v13.i4.pp4843-4855

Abstract

In the field of digital image processing, content-based image retrieval (CBIR) has become essential for searching images based on visual content characteristics such as color, shape, and texture, rather than relying on text-based annotations. To address the increasing demands for efficiency and precision in CBIR systems, we introduce the HybridEnsembleNet methodology. HybridEnsembleNet combines deep learning algorithms with an asymmetric retrieval framework to optimize feature extraction and comparison in extensive image databases. This novel approach, tailored specifically for CBIR, employs a lightweight query structure capable of handling large-scale data in resource-constrained environments. The experiments were performed on the ROxford and RParis datasets, where the deep learning component of HybridEnsembleNet significantly refines the accuracy of image matching and retrieval. On the ROxford dataset, the medium and hard difficulty benchmarks show enhancements of 5.53% and 10.44%, respectively; similarly, the RParis dataset, under the medium and hard benchmarks, exhibits improvements of 3.01% and 5.83%, outperforming existing models. By overcoming the traditional limitations of CBIR systems in mean average precision (mAP), HybridEnsembleNet provides a scalable, efficient, and more accurate solution for retrieving relevant images from vast digital libraries.
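The asymmetric retrieval idea in this abstract (a heavy offline index of the gallery, a lightweight query path, ranking evaluated with mAP) can be illustrated with a minimal sketch. The code below is not the authors' implementation: the encoders are assumed to have already produced feature vectors, and all names, dimensions, and data are illustrative placeholders.

import numpy as np

# Hedged sketch of asymmetric retrieval: gallery images are indexed offline
# with a heavy encoder, queries are embedded online with a lightweight
# encoder, and ranking uses cosine similarity over normalized features.
# Feature extraction itself is assumed to have happened already.

def l2_normalize(x, axis=-1, eps=1e-12):
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def rank_gallery(query_feat, gallery_feats):
    """Return gallery indices sorted by cosine similarity to the query."""
    sims = l2_normalize(gallery_feats) @ l2_normalize(query_feat)
    return np.argsort(-sims)

def average_precision(ranked_ids, relevant_ids):
    """AP for one query; mAP is the mean of AP over all queries."""
    relevant = set(relevant_ids)
    hits, precision_sum = 0, 0.0
    for rank, idx in enumerate(ranked_ids, start=1):
        if idx in relevant:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / max(len(relevant), 1)

# Toy usage with random 512-D features standing in for real descriptors.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 512)).astype(np.float32)
query = rng.normal(size=512).astype(np.float32)
ranking = rank_gallery(query, gallery)
print("AP:", average_precision(ranking, relevant_ids=[3, 17, 250]))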
A novel scalable deep ensemble learning framework for big data classification via MapReduce integration
Varadharajan, Kesavan Mettur; Prem Kumar, Josephine; Ashwin, Nanda
IAES International Journal of Artificial Intelligence (IJ-AI) Vol 14, No 2: April 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijai.v14.i2.pp1386-1400

Abstract

Big data classification involves the systematic sorting and analysis of extensive datasets aggregated from a variety of sources. These datasets may include, but are not limited to, electronic records, digital imaging, genetic information sequences, transactional data, research outputs, and data streams from wearable technologies and connected devices. This paper introduces the scalable deep ensemble learning framework for big data classification (SDELF-BDC), a novel methodology tailored for the classification of large-scale data. At its core, SDELF-BDC leverages a Hadoop-based MapReduce framework for feature selection, significantly reducing feature length and enhancing computational efficiency. The methodology is further augmented by a deep ensemble model that judiciously applies a variety of deep learning classifiers based on data characteristics, thereby ensuring optimal performance. Each classifier's output is then refined through an optimization-based ensemble approach. The result is a robust classification system that excels in predictive accuracy while maintaining scalability and responsiveness to the dynamic requirements of big data environments. Through a strategic combination of classifiers and an innovative reduction phase, SDELF-BDC emerges as a comprehensive solution for big data classification challenges, setting new benchmarks for predictive analytics in diverse and data-intensive domains.
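As a rough illustration of the map/reduce feature-selection step described above, the following sketch computes per-partition feature statistics in a map step and merges them into global per-feature variances in a reduce step, keeping only sufficiently variable features. It runs in plain Python rather than on Hadoop, and the variance threshold, data, and function names are illustrative assumptions, not taken from the paper.

import numpy as np

# Hedged sketch of MapReduce-style feature selection: each map call sees one
# data partition and emits partial statistics; the reduce call merges them.

def map_partition(X_part):
    """Map step: per-feature count, sum, and sum of squares for one split."""
    return X_part.shape[0], X_part.sum(axis=0), (X_part ** 2).sum(axis=0)

def reduce_partitions(partials):
    """Reduce step: merge partial statistics into global per-feature variance."""
    n = sum(p[0] for p in partials)
    s = sum(p[1] for p in partials)
    sq = sum(p[2] for p in partials)
    mean = s / n
    return sq / n - mean ** 2  # variance of each feature over all partitions

def select_features(X_parts, min_variance=0.5):
    variance = reduce_partitions([map_partition(p) for p in X_parts])
    return np.where(variance >= min_variance)[0]

# Toy usage: three "partitions" of a 20-feature dataset with mixed variances.
rng = np.random.default_rng(1)
scales = rng.uniform(0.1, 2.0, 20)
parts = [rng.normal(scale=scales, size=(500, 20)) for _ in range(3)]
print("features kept:", select_features(parts))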
HBRFE: an enhanced recursive feature elimination model for big data classification
Varadharajan, Kesavan Mettur; Kumar, Josephine Prem; Ashwin, Nanda
Bulletin of Electrical Engineering and Informatics Vol 14, No 4: August 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/eei.v14i4.9595

Abstract

The process of classification in big data is a tedious task due to the large volume, veracity, and variety of the data. Classifying big data paves the way to organizing the data and improving classifier performance. This research article proposes a Hadoop-framework-based recursive feature elimination model, called HBRFE, that extracts significant features from big data by integrating the map and reduce framework. HBRFE extracts the significant features by removing the least relevant and irrelevant features from the dataset using a refined recursive feature elimination (RFE) within the map and reduce framework. The method takes the mean of each attribute and finds the variance in each instance. The proposed model is evaluated and analyzed in terms of classification accuracy and time complexity. This research utilizes various classifiers, including artificial neural network (ANN), support vector machine (SVM), random forest (RF), k-nearest neighbors (KNN), and AdaBoost, to measure classification performance on big data. The proposed HBRFE model is compared with different feature selection methods, including RFE, Relief, backward feature elimination, maximum relevance k-nearest neighbors (MR-KNN), and the scalable deep ensemble learning framework for big data classification (SDELF-BDC).
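A minimal sketch of the pipeline outlined in this abstract, recursive feature elimination followed by the listed classifiers, is shown below. It uses scikit-learn's RFE as a stand-in for the Hadoop-based map/reduce refinement; the synthetic dataset, feature counts, and hyperparameters are illustrative assumptions rather than the authors' configuration.

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Synthetic stand-in for a big-data table: 40 features, 10 of them informative.
X, y = make_classification(n_samples=2000, n_features=40,
                           n_informative=10, random_state=0)

# Recursive feature elimination: repeatedly drop the weakest features
# as ranked by a random forest, keeping the 10 strongest.
selector = RFE(RandomForestClassifier(n_estimators=100, random_state=0),
               n_features_to_select=10)
X_reduced = selector.fit_transform(X, y)

# Evaluate the classifiers named in the abstract on the reduced feature set.
X_tr, X_te, y_tr, y_te = train_test_split(X_reduced, y,
                                          test_size=0.3, random_state=0)
classifiers = {
    "ANN": MLPClassifier(max_iter=500, random_state=0),
    "SVM": SVC(),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "KNN": KNeighborsClassifier(),
    "AdaBoost": AdaBoostClassifier(random_state=0),
}
for name, clf in classifiers.items():
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, clf.predict(X_te)))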