The multilayer perceptron (MLP) is a powerful machine learning algorithm capable of modeling complex, non-linear relationships, making it well suited to predicting car purchasing power. Its performance, however, depends heavily on hyperparameter tuning and data quality. This study optimizes MLP performance using GridSearch and Optuna for hyperparameter tuning while addressing class imbalance with the Synthetic Minority Over-sampling Technique (SMOTE). The dataset comprises demographic and financial attributes that influence car purchasing power. Because the dataset initially exhibited class imbalance, which can lead to biased predictions, SMOTE was applied to generate synthetic minority-class samples and produce a balanced class distribution. Two hyperparameter tuning approaches were implemented: GridSearch, which exhaustively explores a predefined parameter grid, and Optuna, an adaptive optimization framework based on a Bayesian approach. The results show that Optuna achieved the highest accuracy of 95.00% using the Adam optimizer, whereas GridSearch obtained a best accuracy of 94.17% with the RMSProp optimizer, demonstrating Optuna's superior ability to identify optimal hyperparameters. Additionally, SMOTE markedly improved model stability and predictive performance by ensuring adequate representation of each class. These findings offer insights into best practices for optimizing MLPs in predictive modeling. The combination of SMOTE and advanced hyperparameter tuning is applicable to other domains that require accurate predictive analytics, such as finance, healthcare, and marketing. Future research can explore alternative optimization algorithms and data augmentation techniques to further enhance model robustness and accuracy.
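The two core ingredients described above, SMOTE oversampling and exhaustive grid search, can be sketched in plain Python. The function names `smote` and `grid_search`, the neighbour count `k`, and the toy objective below are illustrative assumptions, not the study's actual code; in practice one would typically use imbalanced-learn's `SMOTE`, scikit-learn's `GridSearchCV`, and the `optuna` package.

```python
import math
import random
from itertools import product

def smote(minority, n_synthetic, k=3, seed=0):
    """Minimal SMOTE sketch (names and k are illustrative assumptions):
    create each synthetic point by interpolating between a random
    minority sample and one of its k nearest minority-class neighbours."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_synthetic):
        x = rng.choice(minority)
        # k nearest minority-class neighbours of x (excluding x itself)
        neighbours = sorted(
            (p for p in minority if p is not x),
            key=lambda p: math.dist(x, p),
        )[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(xi + gap * (ni - xi) for xi, ni in zip(x, nb)))
    return synthetic

def grid_search(objective, grid):
    """Exhaustively evaluate every combination in a predefined grid
    (the systematic strategy the abstract contrasts with Optuna's
    adaptive, Bayesian-style sampling) and keep the best score."""
    best_params, best_score = None, float("-inf")
    for combo in product(*grid.values()):
        params = dict(zip(grid, combo))
        score = objective(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

As a usage sketch, `smote([(0, 0), (1, 0), (0, 1), (1, 1)], 5)` returns five interpolated points inside the unit square, and `grid_search` over a small grid of learning rates and layer widths returns the highest-scoring combination under whatever validation objective is supplied.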
Copyright © 2025