Ku-Mahamud, Ku Ruhana
Unknown Affiliation

Published: 2 Documents

Feature selection for sky image classification based on self adaptive ant colony system algorithm
Petwan, Montha; Ku-Mahamud, Ku Ruhana
International Journal of Electrical and Computer Engineering (IJECE) Vol 13, No 6: December 2023
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijece.v13i6.pp7037-7047

Abstract

Statistical feature extraction is typically used to obtain important features from sky images for cloud classification. The extracted features, however, contain noise and redundant and irrelevant features that can reduce classification accuracy and increase computation time. Thus, this paper proposes a new feature selection algorithm based on an ant colony system (ACS) to distinguish the significant features among those extracted. Informative features are extracted from the sky images using a Gaussian smoothness standard deviation and then represented in a directed graph. In the feature selection phase, the self-adaptive ACS (SAACS) algorithm is improved by enhancing its exploration mechanism so that only significant features are selected. Support vector machine, kernel support vector machine, multilayer perceptron, random forest, k-nearest neighbor, and decision tree classifiers were used to evaluate the algorithms. Four datasets were used to test the proposed model: Kiel, Singapore whole-sky imaging categories, MGC Diagnostics Corporation, and greatest common divisor. The SAACS algorithm was compared with six bio-inspired benchmark feature selection algorithms and achieved a classification accuracy of 95.64%, superior to all of the benchmarks. Additionally, the Friedman test and Mann-Whitney U test were employed to statistically evaluate the efficiency of the proposed algorithms.
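The abstract describes ACS-style feature selection: pheromone values guide ants toward feature subsets, and the best subset reinforces the pheromone trail. A minimal sketch of that idea follows; the synthetic data, the class-separation fitness proxy, and all parameter values are illustrative assumptions, not the paper's SAACS implementation or its enhanced exploration mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for extracted statistical features: 6 features,
# of which only features 0 and 1 determine the class label.
n = 200
X = rng.normal(size=(n, 6))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

def fitness(subset):
    """Hypothetical fitness: class-mean separation of the selected
    features, penalized by subset size (a proxy for the classifier
    accuracy used in the paper)."""
    Z = X[:, subset]
    gap = Z[y == 1].mean(axis=0) - Z[y == 0].mean(axis=0)
    return float(np.linalg.norm(gap)) / (1 + len(subset))

def acs_feature_selection(n_feats=6, n_ants=10, n_iters=30,
                          rho=0.1, q0=0.8, tau0=1.0):
    tau = np.full(n_feats, tau0)          # pheromone per feature
    best, best_fit = [], -1.0
    for _ in range(n_iters):
        for _ant in range(n_ants):
            # Each ant includes feature j with probability proportional
            # to its pheromone share (a simplified construction rule).
            prob = tau / tau.sum()
            subset = [j for j in range(n_feats)
                      if rng.random() < q0 * prob[j] * n_feats]
            if not subset:                 # never evaluate an empty subset
                subset = [int(np.argmax(tau))]
            f = fitness(subset)
            if f > best_fit:
                best, best_fit = subset, f
        # ACS-style global update: evaporate, then reinforce the best subset.
        tau *= (1 - rho)
        for j in best:
            tau[j] += rho * best_fit
    return sorted(best)

selected = acs_feature_selection()
```

The size penalty in the fitness rewards small subsets, so over iterations the pheromone concentrates on the few features that actually separate the classes.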
Deep residual bidirectional long short-term memory fusion: achieving superior accuracy in facial emotion recognition
Munsarif, Muhammad; Ku-Mahamud, Ku Ruhana
Bulletin of Electrical Engineering and Informatics Vol 14, No 3: June 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/eei.v14i3.9090

Abstract

Facial emotion recognition (FER) is a crucial task in human communication. Various FER models have been introduced, but they often struggle to generalize across datasets and to handle subtle variations in expressions. This study develops a deep residual bidirectional long short-term memory (Bi-LSTM) fusion method to improve FER accuracy. The method combines the strengths of convolutional neural networks (CNN) for spatial feature extraction and Bi-LSTM for capturing temporal dynamics, using residual layers to address the vanishing gradient problem. Testing was performed on three face emotion datasets, and a comparison was made with seventeen models. The results show perfect accuracy on the extended Cohn-Kanade (CK+) and real-world affective faces database (RAF-DB) datasets, and nearly perfect accuracy on the face expression recognition plus (FERPlus) dataset. However, the receiver operating characteristic (ROC) curve for the CK+ dataset shows some inconsistencies, indicating potential overfitting. In contrast, the ROC curves for the RAF-DB and FERPlus datasets are consistent with the high accuracy achieved. The proposed method has proven highly efficient and reliable in classifying various facial expressions, making it a robust solution for FER applications.
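The architecture sketched in the abstract chains three pieces: CNN spatial features, a bidirectional recurrent pass over those features, and a residual (skip) connection fusing the two. The toy forward pass below shows only how the shapes compose; the convolution kernel, a plain tanh RNN standing in for the Bi-LSTM cell, and all sizes are simplifying assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)

def conv2d_relu(x, k):
    """Naive valid 2-D convolution + ReLU (stand-in for the CNN
    spatial-feature stage)."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return np.maximum(out, 0.0)

def bidirectional_rnn(seq, Wf, Wb):
    """Simplified bidirectional recurrence: a tanh RNN run forward and
    backward over the sequence, outputs concatenated per step
    (a stand-in for the Bi-LSTM; hidden size equals input size here)."""
    def run(s, W):
        h = np.zeros(W.shape[0])
        states = []
        for x_t in s:
            h = np.tanh(W @ x_t + h)
            states.append(h)
        return np.stack(states)
    fwd = run(seq, Wf)
    bwd = run(seq[::-1], Wb)[::-1]
    return np.concatenate([fwd, bwd], axis=1)

d, T = 8, 5
img = rng.normal(size=(12, 12))                 # toy grayscale "face"
feat = conv2d_relu(img, rng.normal(size=(3, 3)))  # (10, 10) feature map
seq = feat[:T, :d]                               # rows treated as a length-T sequence
Wf = rng.normal(size=(d, d)) * 0.1
Wb = rng.normal(size=(d, d)) * 0.1
h = bidirectional_rnn(seq, Wf, Wb)               # (T, 2d)
Wp = rng.normal(size=(2 * d, d)) * 0.1
out = seq + h @ Wp                               # residual fusion: skip adds the CNN features back
```

The residual add on the last line is what lets gradients bypass the recurrent stack during training, which is how such fusions mitigate the vanishing gradient problem the abstract mentions.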