Artificial Neural Networks (ANNs) trained with the Backpropagation algorithm have been widely applied across various domains, including data prediction tasks. However, one of the primary challenges in implementing Backpropagation is the selection of an optimal learning rate: a rate that is too high can lead to unstable convergence, while one that is too low can significantly slow down training. To address this issue, this study proposes an optimization of Backpropagation using an Adaptive Learning Rate through the implementation of the Adam optimizer. The objective of this research is to compare the performance of Standard Backpropagation and Backpropagation with the Adam optimizer in predicting rice harvest yields from rainfall, temperature, and humidity variables. The dataset consists of 100 synthetic samples drawn from a normal distribution to resemble real-world data. The results show that the Adam optimizer improves the performance of the ANN model compared to Standard Backpropagation: model accuracy increased from 92.04% to 92.99%, while the loss, Mean Squared Error (MSE), and Root Mean Squared Error (RMSE) decreased significantly, indicating that the Adam-optimized model is more stable and yields lower prediction errors. Therefore, Adaptive Learning Rate optimization with the Adam optimizer is shown to be effective in enhancing both the accuracy and efficiency of ANNs in data prediction tasks.
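The adaptive behavior the abstract attributes to Adam comes from its per-parameter moment estimates. The sketch below is a minimal NumPy illustration of the standard Adam update rule (per Kingma and Ba's formulation), not the paper's actual model: the quadratic objective, hyperparameter values, and variable names are illustrative assumptions, not drawn from this study.

```python
import numpy as np

def adam_step(theta, grad, state, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update. `state` carries (m, v, t) between calls."""
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad           # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad ** 2      # second-moment (uncentered variance)
    m_hat = m / (1 - b1 ** t)              # bias correction for zero-initialized m
    v_hat = v / (1 - b2 ** t)              # bias correction for zero-initialized v
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)  # adaptive per-parameter step
    return theta, (m, v, t)

# Toy demonstration: minimize f(theta) = ||theta||^2, whose gradient is 2*theta.
theta = np.array([2.0, -3.0])
state = (np.zeros_like(theta), np.zeros_like(theta), 0)
for _ in range(2000):
    theta, state = adam_step(theta, 2.0 * theta, state)
print(np.linalg.norm(theta))
```

Because each parameter's step is scaled by its own gradient history, Adam takes large steps where gradients are consistently small and damps steps where they oscillate, which is the stability gain the abstract reports over a fixed-learning-rate Backpropagation.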
Copyright © 2025