Putra Sadewa, Fastabyq
Unknown Affiliation

Published: 1 document
Application of ADASYN and Optuna in the XGBoost Algorithm for Stunting Detection
Putra Sadewa, Fastabyq; Kurniawan, Defri
Journal of Applied Informatics and Computing Vol. 10 No. 1 (2026): February 2026
Publisher : Politeknik Negeri Batam

DOI: 10.30871/jaic.v10i1.12035

Abstract

This study aims to develop an early detection model for childhood stunting risk using a machine learning approach based on Extreme Gradient Boosting (XGBoost), integrated with the Adaptive Synthetic Sampling (ADASYN) technique for data balancing and Optuna-based hyperparameter optimization. One of the main challenges in stunting prediction is class imbalance, in which the number of stunting cases significantly exceeds that of non-stunting cases, reducing the model's ability to accurately identify the minority class. To address this issue, the study implements data deduplication and structured data splitting, and applies ADASYN exclusively to the training data to prevent data leakage and preserve the validity of the evaluation process. The proposed model (XGBoost with ADASYN and Optuna) is then compared with a baseline model that combines XGBoost and SMOTE. Experimental results show that the proposed model achieves an accuracy of 81.98%, a recall of 91.50%, and an F1-score of 89.14%, indicating improved sensitivity and more balanced classification performance than the baseline. These findings demonstrate that integrating ADASYN with Optuna-based hyperparameter optimization enhances model stability and generalization capability, making it a viable data-driven approach for stunting risk detection in settings with imbalanced class distributions.
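The leakage-safe pipeline the abstract describes — split first, then oversample only the training fold, then fit a boosted-tree classifier — can be sketched as follows. This is an illustrative reconstruction, not the paper's code: it uses a simplified ADASYN-style sampler written with scikit-learn's `NearestNeighbors` (rather than the `imbalanced-learn` implementation), scikit-learn's `GradientBoostingClassifier` as a stand-in for XGBoost, a synthetic dataset, and no Optuna tuning. The function name `adasyn_like` and all parameter choices are assumptions for demonstration only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier  # stand-in for XGBoost
from sklearn.metrics import f1_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors


def adasyn_like(X, y, minority_label, k=5, seed=0):
    """Simplified ADASYN: synthesize minority samples by interpolation,
    generating more around minority points whose neighborhoods are
    dominated by the majority class (ADASYN's density weights r_i)."""
    rng = np.random.default_rng(seed)
    X_min = X[y == minority_label]
    n_needed = int((y != minority_label).sum() - len(X_min))
    if n_needed <= 0:
        return X, y
    # fraction of majority-class points among each minority point's k neighbors
    nn_all = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn_all.kneighbors(X_min)
    r = np.array([(y[i[1:]] != minority_label).mean() for i in idx])
    if r.sum() == 0:
        r = np.ones_like(r)
    counts = np.floor(r / r.sum() * n_needed).astype(int)
    # neighbors within the minority class, used for interpolation partners
    k_min = min(k, len(X_min) - 1)
    nn_min = NearestNeighbors(n_neighbors=k_min + 1).fit(X_min)
    _, idx_min = nn_min.kneighbors(X_min)
    synth = []
    for i, c in enumerate(counts):
        for _ in range(c):
            j = rng.choice(idx_min[i][1:])
            lam = rng.random()  # random point on the segment X_min[i] -> X_min[j]
            synth.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    if not synth:
        return X, y
    X_new = np.vstack([X, np.asarray(synth)])
    y_new = np.concatenate([y, np.full(len(synth), minority_label)])
    return X_new, y_new


# imbalanced toy data (illustrative only, not the paper's stunting dataset)
X, y = make_classification(n_samples=1000, weights=[0.85, 0.15], random_state=0)

# split BEFORE resampling, so the test fold never sees synthetic points
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# resample only the training split — this is what prevents data leakage
X_bal, y_bal = adasyn_like(X_tr, y_tr, minority_label=1)

clf = GradientBoostingClassifier(random_state=0).fit(X_bal, y_bal)
pred = clf.predict(X_te)
print(f"recall={recall_score(y_te, pred):.3f}  f1={f1_score(y_te, pred):.3f}")
```

Applying the sampler before the split (or to the test fold) would let synthetic points derived from test samples leak into training, inflating the evaluation — which is exactly the failure mode the abstract's "ADASYN exclusively on the training data" step guards against.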