Syaifuddin, Angga Exca Pradipta
Unknown Affiliation

Published: 1 document

Squeeze-excitation half U-Net and synthetic minority oversampling technique oversampling for papilledema image classification Wiharto, Wiharto; Syaifuddin, Angga Exca Pradipta
IAES International Journal of Artificial Intelligence (IJ-AI) Vol 14, No 2: April 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijai.v14.i2.pp1410-1419

Abstract

The emergence of various convolutional neural network (CNN) architectures indicates progress in the computer vision field. However, most of these architectures have many parameters, which increases the computational cost of training. Additionally, imbalanced data sources are often encountered, causing models to overfit. The aim of this study is to evaluate a new method that classifies retinal fundus images from imbalanced data into their corresponding classes using fewer parameters than previous methods. To achieve this, the squeeze-excitation half U-Net (SEHUNET) architecture, a modification of half U-Net with a squeeze-and-excitation process that provides an attention mechanism on each feature-map channel, is proposed in combination with the synthetic minority oversampling technique (SMOTE). The test accuracy of SEHUNET is 98.52%, with an area under the receiver operating characteristic curve (AUROC) of 0.999. This result outperforms a previous study that used a CNN with Bayesian optimization, which achieved an accuracy of 95.89% and an AUROC of 0.992. SEHUNET is also competitive with the transfer learning methods used in previous research, such as InceptionV3 with 96.35% accuracy, visual geometry group (VGG) with 96.8%, and ResNet with 98.63%. SEHUNET achieves this performance with only 0.268 million parameters, compared with the 11 million to 33 million parameters of the architectures used in previous research.
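The per-channel attention that the abstract attributes to the squeeze-and-excitation process can be sketched as follows. This is a minimal NumPy illustration of the general squeeze-and-excitation mechanism (squeeze via global average pooling, excitation via a bottleneck MLP with a sigmoid gate, then channel-wise rescaling), not the authors' SEHUNET implementation; the function name `se_block` and the randomly initialized weights `w1` and `w2` are hypothetical placeholders for learned parameters.

```python
import numpy as np

def se_block(feature_maps, w1, w2):
    """Squeeze-and-excitation sketch: reweight each channel of a
    (C, H, W) feature map by a gate in (0, 1)."""
    # Squeeze: global average pooling collapses each channel to a scalar.
    z = feature_maps.mean(axis=(1, 2))            # shape (C,)
    # Excitation: reduction layer with ReLU, then expansion with sigmoid.
    h = np.maximum(z @ w1, 0.0)                   # shape (C // r,)
    gate = 1.0 / (1.0 + np.exp(-(h @ w2)))        # shape (C,), values in (0, 1)
    # Scale: broadcast the per-channel gates over the spatial dimensions.
    return feature_maps * gate[:, None, None]

# Illustrative call with random weights (reduction ratio r = 4).
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4, 4))   # 8 channels, 4x4 spatial grid
w1 = rng.standard_normal((8, 2))
w2 = rng.standard_normal((2, 8))
y = se_block(x, w1, w2)
```

Because each gate lies strictly between 0 and 1, the block can only attenuate channels, letting the network emphasize informative feature-map channels relative to the rest; in a trained model, `w1` and `w2` would be learned jointly with the convolutional layers.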