Masaugi, Fathan Fanrita
Unknown Affiliation

Published: 1 document
Articles

Deep Learning Menggunakan Algoritma Xception dan Augmentasi Flip Pada Klasifikasi Kematangan Sawit (Deep Learning Using the Xception Algorithm and Flip Augmentation for Palm Fruit Ripeness Classification)
Masaugi, Fathan Fanrita; Yanto, Febi; Budianita, Elvia; Sanjaya, Suwanto; Syafria, Fadhilah
KLIK: Kajian Ilmiah Informatika dan Komputer Vol. 4 No. 6 (2024): June 2024
Publisher: STMIK Budi Darma

DOI: 10.30865/klik.v4i6.1938

Abstract

Palm oil is an important commodity in Indonesia, which is the world's largest palm oil exporter. Ripe palm fruit is marked by a change in color from black to reddish yellow, and unripe fruit has a significant negative effect on crude palm oil (CPO) production. Data were collected by photographing palm fruit directly on oil palm plantations and by drawing on a Kaggle dataset, yielding 1000 images plus 1000 additional images produced by flip augmentation. Xception, which stands for Extreme version of Inception, is a deep learning algorithm; combined with flip augmentation, it was shown to classify images from the dataset more accurately. The Adam (Adaptive Moment Estimation) optimizer from TensorFlow was used, with varying learning rate and dropout values. Images of ripe and unripe palm fruit were classified with the Xception algorithm on both augmented and non-augmented data, and experiments varied the learning rate over 0.1, 0.01, and 0.001 and the dropout over 0.1, 0.01, and 0.001. A 90:10 data split produced the best accuracy, reaching 95%. Tuning these parameters was shown to increase accuracy compared to training without tuned parameters and without flip augmentation. The best accuracy of the Xception model was 95% on augmented data with a learning rate of 0.001 and a dropout of 0.1.
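To make the described pipeline concrete, below is a minimal TensorFlow/Keras sketch of the setup the abstract outlines: flip augmentation that doubles the dataset, an Xception backbone, dropout, and the Adam optimizer with the best reported hyperparameters (learning rate 0.001, dropout 0.1). The 299x299 input size and ImageNet pretraining are assumptions made for illustration; the abstract does not specify them.

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Assumed, not stated in the abstract: Xception's default 299x299 input
# and ImageNet-pretrained weights for the backbone.
IMG_SIZE = (299, 299)
NUM_CLASSES = 2  # ripe vs. unripe palm fruit

def add_flipped_copies(ds):
    # Flip augmentation as described: every image gets a horizontally
    # flipped copy, doubling 1000 originals to 2000 images in total.
    flipped = ds.map(lambda img, label: (tf.image.flip_left_right(img), label))
    return ds.concatenate(flipped)

# Xception ("Extreme version of Inception") backbone with global average
# pooling and a small classification head on top.
base = keras.applications.Xception(
    include_top=False,
    weights="imagenet",
    input_shape=IMG_SIZE + (3,),
    pooling="avg",
)

inputs = keras.Input(shape=IMG_SIZE + (3,))
x = keras.applications.xception.preprocess_input(inputs)
x = base(x)
x = layers.Dropout(0.1)(x)  # best reported dropout value
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = keras.Model(inputs, outputs)

# Adam (Adaptive Moment Estimation) with the best reported learning rate.
model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=0.001),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

Given a hypothetical tf.data.Dataset of (image, label) pairs named train_ds, split 90:10 into training and test sets, training would follow as model.fit(add_flipped_copies(train_ds).batch(32), epochs=...), mirroring the augmented-data experiments reported above.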