This study aims to optimize the backpropagation algorithm by evaluating several activation functions to improve the accuracy of inflation rate predictions. Using historical inflation data, neural network models were constructed and trained with the Sigmoid (Logsig), ReLU, and TanH activation functions and evaluated with the Mean Squared Error (MSE) metric. The results indicate that both the choice of activation function and the network architecture significantly influence the model's ability to predict inflation rates. In the 5-7-1 architecture, the Logsig and ReLU activation functions demonstrated the best performance: Logsig achieved the lowest MSE (0.00923089) and the highest accuracy (75%) on the test data, while ReLU yielded the most substantial overall performance improvement on the dataset used. These results underscore the importance of selecting an appropriate activation function to enhance prediction accuracy. This research concludes that optimizing activation functions in backpropagation is a crucial step in developing more accurate inflation prediction models, contributing to both the neural network literature and practical economic applications.
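The experimental setup described above (a 5-7-1 backpropagation network trained with Sigmoid/Logsig, TanH, and ReLU, compared by MSE) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the inflation series here is synthetic placeholder data, and the learning rate, epoch count, and weight initialization are assumptions; the paper's real historical data and training configuration are not shown in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a historical inflation series (assumption: the
# paper's real data is not available here), scaled to [0, 1].
series = 3.0 + np.sin(np.arange(120) * 0.3) + rng.normal(0, 0.05, 120)
series = (series - series.min()) / (series.max() - series.min())

# Sliding window: 5 past values predict the next one (the "5" in 5-7-1).
X = np.stack([series[i:i + 5] for i in range(len(series) - 5)])
y = series[5:].reshape(-1, 1)

# Each activation is paired with its derivative expressed in terms of the
# activation's output a = f(z), as used in backpropagation.
ACTS = {
    "logsig": (lambda z: 1.0 / (1.0 + np.exp(-z)), lambda a: a * (1.0 - a)),
    "tanh":   (np.tanh,                            lambda a: 1.0 - a ** 2),
    "relu":   (lambda z: np.maximum(0.0, z),       lambda a: (a > 0).astype(float)),
}

def train_571(act, epochs=3000, lr=0.1):
    """Train a 5-7-1 network by plain gradient descent; return (MSE before, MSE after)."""
    f, df = ACTS[act]
    W1 = rng.normal(0, 0.5, (5, 7)); b1 = np.zeros(7)   # input -> 7 hidden units
    W2 = rng.normal(0, 0.5, (7, 1)); b2 = np.zeros(1)   # hidden -> 1 output (linear)

    def mse():
        return float(np.mean((f(X @ W1 + b1) @ W2 + b2 - y) ** 2))

    start = mse()
    for _ in range(epochs):
        h = f(X @ W1 + b1)            # hidden activations, shape (N, 7)
        err = (h @ W2 + b2) - y       # output error, shape (N, 1)
        dh = (err @ W2.T) * df(h)     # error backpropagated to the hidden layer
        W2 -= lr * h.T @ err / len(X); b2 -= lr * err.mean(0)
        W1 -= lr * X.T @ dh / len(X);  b1 -= lr * dh.mean(0)
    return start, mse()

if __name__ == "__main__":
    for name in ACTS:
        before, after = train_571(name)
        print(f"{name:6s} MSE: {before:.5f} -> {after:.5f}")
```

On this toy series, all three activations reduce the MSE from its initial value; which one ends lowest depends on the data and hyperparameters, which is precisely the comparison the study carries out on real inflation data.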
Copyright © 2024