Building of Informatics, Technology and Science
Vol 5 No 1 (2023): June 2023

Pooling Layer Experiment Using Standard Deviation for Face Image Dataset Classification with the CNN Method

Pratama, Yovi
Rasywir, Errissya
Fachruddin, Fachruddin
Kisbianty, Desi
Irawan, Beni



Article Info

Publish Date
29 Jun 2023

Abstract

Deep learning, particularly the Convolutional Neural Network (CNN), has proven reliable for processing data across a wide range of platforms. In this study, we modify the CNN using a statistical dispersion calculation: the computation in the pooling layer, which conventionally uses one of two formulas, max pooling or average pooling, is replaced with the standard deviation of each window as the reduced image intensity value. The experiment uses facial recognition as the indicator for testing this modification, classifying face images from the Aberdeen dataset (https://pics.stir.ac.uk/2D_face_sets.htm) with the CNN method. From the experiments conducted, the ELU activation function with the Adagrad optimizer scored 77.844% for max pooling and 79.844% for standard-deviation pooling; SELU with RMSprop scored 77.986% for max pooling and 75.986% for standard-deviation pooling; Softplus with SGD scored 77.844% for max pooling and 76.344% for standard-deviation pooling; tanh with Adadelta scored 87.844% for max pooling and 85.844% for standard-deviation pooling; ELU with Adam scored 87.853% for max pooling and 85.285% for standard-deviation pooling; ELU with Adamax scored 87.842% for max pooling and 86.242% for standard-deviation pooling; and ELU with Nadam scored 87.845% for max pooling and 86.345% for standard-deviation pooling. Across all experiments, max pooling still gave better values than the standard-deviation calculation, with the best tuning result, 87.853%, obtained using the ELU activation function and the Adam optimizer.
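The paper does not include an implementation, but the proposed modification is straightforward to sketch: each pooling window is reduced to its standard deviation instead of its maximum or mean. Below is a minimal illustration in TensorFlow/Keras; the framework choice, the layer name StdDevPooling2D, and all shapes are assumptions for illustration, not the authors' code.

import tensorflow as tf

class StdDevPooling2D(tf.keras.layers.Layer):
    """Pooling layer that reduces each pool_size x pool_size window
    to its standard deviation, in place of max or average pooling."""

    def __init__(self, pool_size=2, **kwargs):
        super().__init__(**kwargs)
        self.pool_size = pool_size

    def call(self, inputs):
        ps = self.pool_size
        # Extract non-overlapping windows:
        # shape (batch, out_h, out_w, ps * ps * channels)
        patches = tf.image.extract_patches(
            inputs,
            sizes=[1, ps, ps, 1],
            strides=[1, ps, ps, 1],
            rates=[1, 1, 1, 1],
            padding="VALID",
        )
        channels = inputs.shape[-1]
        out_h = tf.shape(patches)[1]
        out_w = tf.shape(patches)[2]
        # Separate the window pixels from the channel axis.
        patches = tf.reshape(patches, [-1, out_h, out_w, ps * ps, channels])
        # The standard deviation of each window becomes the pooled value.
        return tf.math.reduce_std(patches, axis=3)

As a usage sketch, the layer drops in where MaxPooling2D would normally sit; the ELU activation and Adam optimizer below mirror the best tuning reported in the abstract, while the input size and class count are hypothetical:

num_classes = 10  # hypothetical; depends on how the face identities are split

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),       # assumed image size
    tf.keras.layers.Conv2D(32, 3, activation="elu"),
    StdDevPooling2D(pool_size=2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])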

Copyright © 2023






Journal Info

Abbrev

bits

Publisher

Subject

Computer Science & IT

Description

Building of Informatics, Technology and Science (BITS) is an open-access medium for publishing scientific articles containing research results in information technology and computing. Papers submitted to this journal are first checked for plagiarism and peer-reviewed to maintain quality. ...