This study explores the optimization of the Extreme Gradient Boosting (XGBoost) algorithm for credit card default prediction through systematic hyperparameter tuning with Grid Search and Random Search. Using the publicly available Default of Credit Card Clients dataset from the UCI Machine Learning Repository, the research focuses on enhancing model performance by tuning key hyperparameters: learning rate, maximum tree depth, number of estimators, subsample ratio, and column sampling rate. The baseline XGBoost model achieved an accuracy of 0.8118, while the models tuned with Grid Search and Random Search improved accuracy to 0.8183 and 0.8188, respectively. Although the improvement appears modest, the optimized models exhibited a better balance between precision and recall, particularly in identifying defaulters within an imbalanced dataset, an essential consideration in credit risk assessment. The results demonstrate that systematic hyperparameter optimization not only improves predictive performance but also contributes to model stability and generalization. Moreover, Random Search proved more computationally efficient, achieving near-optimal performance with fewer evaluations than Grid Search, underscoring its practicality for large-scale financial risk modeling. The novelty of this study lies in the comparative evaluation of these two optimization techniques in the context of financial risk prediction, offering practical insights into how efficient hyperparameter tuning can enhance the reliability and scalability of machine learning models in real-world credit risk management systems.
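To make the tuning setup concrete, the sketch below pairs scikit-learn's GridSearchCV and RandomizedSearchCV with an XGBClassifier over the hyperparameters named above (learning rate, maximum tree depth, number of estimators, subsample ratio, column sampling rate). The parameter ranges, cross-validation settings, and the synthetic placeholder data are illustrative assumptions, not the study's exact configuration.

```python
# A minimal sketch of the two tuning strategies compared in this study.
# Parameter grids/distributions and CV settings are assumptions for
# illustration, not the values reported in the paper.
import numpy as np
from scipy.stats import randint, uniform
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV, train_test_split
from xgboost import XGBClassifier

# Placeholder data shaped like the UCI dataset
# (30,000 clients, 23 features, binary default label).
rng = np.random.default_rng(42)
X = rng.normal(size=(30_000, 23))
y = rng.integers(0, 2, size=30_000)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

base = XGBClassifier(eval_metric="logloss", random_state=42)

# Grid Search: exhaustively evaluates every combination in the grid.
grid = GridSearchCV(
    base,
    param_grid={
        "learning_rate": [0.05, 0.1, 0.3],
        "max_depth": [3, 5, 7],
        "n_estimators": [100, 300],
        "subsample": [0.8, 1.0],
        "colsample_bytree": [0.8, 1.0],
    },
    scoring="accuracy",
    cv=5,
    n_jobs=-1,
).fit(X_train, y_train)

# Random Search: samples a fixed budget of configurations, typically
# reaching near-optimal settings with far fewer model fits than the grid.
rand = RandomizedSearchCV(
    base,
    param_distributions={
        "learning_rate": uniform(0.01, 0.3),   # range [0.01, 0.31]
        "max_depth": randint(3, 10),           # integers 3..9
        "n_estimators": randint(100, 500),
        "subsample": uniform(0.6, 0.4),        # range [0.6, 1.0]
        "colsample_bytree": uniform(0.6, 0.4),
    },
    n_iter=30,
    scoring="accuracy",
    cv=5,
    random_state=42,
    n_jobs=-1,
).fit(X_train, y_train)

print("Grid Search best CV accuracy:  ", grid.best_score_)
print("Random Search best CV accuracy:", rand.best_score_)
```

Note the efficiency contrast the abstract highlights: the grid above requires 72 configurations (360 fits at 5-fold CV), whereas the random search evaluates only the 30 sampled configurations regardless of how finely the distributions are specified.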