INOVTEK Polbeng - Seri Informatika
Vol. 10 No. 3 (2025): November

Evaluation of the Effect of Regularization on Neural Networks for Regression Prediction: A Case Study of MLP, CNN, and FNN Models

Susandri
Zamsuri, Ahmad
Nasution, Nurliana
Ramadhani, Maya



Article Info

Publish Date
15 Nov 2025

Abstract

Regularization is an important technique in deep learning for improving generalization and reducing overfitting. This study evaluates the effect of regularization on the performance of neural network models in a regression prediction task using earthquake data. We compare Multilayer Perceptron (MLP), Convolutional Neural Network (CNN), and Feedforward Neural Network (FNN) architectures with L2 and Dropout regularization. The experimental results show that the MLP without regularization achieved the best performance (RMSE: 0.500, MAE: 0.380, R²: 0.625), although it was prone to overfitting. The CNN performed poorly on tabular data, while the FNN showed marginal improvement with deeper layers. The novelty of this study lies in its comparative evaluation of regularization strategies across multiple architectures for earthquake regression prediction, highlighting practical implications for early warning systems.
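To illustrate the two regularization techniques the abstract compares, the following is a minimal NumPy sketch of a one-hidden-layer regression MLP with inverted Dropout and an L2 penalty added to the MSE loss. All layer sizes, the dropout rate, and the L2 coefficient are illustrative assumptions, not the hyperparameters used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(X, W1, b1, W2, b2, dropout_p=0.0, training=False):
    """One-hidden-layer MLP forward pass with optional inverted Dropout."""
    h = np.maximum(0.0, X @ W1 + b1)  # ReLU hidden layer
    if training and dropout_p > 0.0:
        # Inverted dropout: zero units with probability p, rescale the rest
        # so the expected activation matches inference time.
        mask = (rng.random(h.shape) >= dropout_p) / (1.0 - dropout_p)
        h = h * mask
    return h @ W2 + b2  # linear output for regression

def l2_regularized_mse(y_pred, y_true, weights, lam):
    """MSE loss plus an L2 (weight-decay) penalty on the weight matrices."""
    mse = np.mean((y_pred - y_true) ** 2)
    l2 = lam * sum(np.sum(W ** 2) for W in weights)
    return mse + l2

# Tiny illustration on random data (8 samples, 4 features).
X = rng.normal(size=(8, 4))
y = rng.normal(size=(8, 1))
W1, b1 = rng.normal(size=(4, 16)) * 0.1, np.zeros(16)
W2, b2 = rng.normal(size=(16, 1)) * 0.1, np.zeros(1)

y_pred = mlp_forward(X, W1, b1, W2, b2, dropout_p=0.2, training=True)
loss = l2_regularized_mse(y_pred, y, [W1, W2], lam=1e-3)
```

Because the L2 term is non-negative, the regularized loss is always at least the plain MSE; tuning `lam` and `dropout_p` trades training fit against the generalization gap the study measures.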

Copyrights © 2025






Journal Info

Abbrev

ISI

Publisher

Subject

Computer Science & IT

Description

The Journal of Innovation and Technology (INOVTEK Polbeng - Seri Informatika) is a distinguished publication hosted by the State Polytechnic of Bengkalis. Dedicated to advancing the field of informatics, this scientific research journal serves as a vital platform for academics, researchers, and ...