This research studies a backpropagation artificial neural network (ANN) implemented with a Matlab GUI. The data used are sunshine duration records, and the network architecture is formed by specifying the number of units in each layer. After the network is formed, training and testing are carried out on the previously grouped data. The prediction stage then uses the trainrp training method with the logsig activation function in each layer. The input layer has 120 neurons, the first hidden layer 10, the second hidden layer 10, and the output layer 1, while the training parameters are a maximum of 1000 epochs, a goal of 0.001, a learning rate of 0.7, and a step of 1. Based on the simulation, the results were 51.25% for January, 62.00% for February, 59.29% for March, 64.52% for April, 71.42% for May, 79.32% for June, 64.25% for July, 77.87% for August, 85.02% for September, 81.33% for October, 56.67% for November, and 39.14% for December. The simulation results were obtained with an MSE of 5.114, a MAD of 1.479, a MAPE of 2.162, an RMSE of 2.261, and an accuracy of 97.43%.
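For illustration, the following is a minimal Matlab sketch of the network configuration described above, assuming the Neural Network (Deep Learning) Toolbox. The placeholder data, the variable names X and T, and the error-metric calculations are illustrative assumptions, not taken from the paper; the original work used a GUI application rather than this script.

% Placeholder data: 120 inputs per sample, 36 samples, one target each
% (the actual sunshine duration dataset is not given in the abstract).
X = rand(120, 36);
T = rand(1, 36);

% Two hidden layers of 10 neurons, trained with resilient backpropagation.
net = feedforwardnet([10 10], 'trainrp');
net.layers{1}.transferFcn = 'logsig';   % logsig activation in each layer
net.layers{2}.transferFcn = 'logsig';
net.layers{3}.transferFcn = 'logsig';   % output layer (1 neuron)

net.trainParam.epochs = 1000;   % maximum epochs
net.trainParam.goal   = 0.001;  % performance goal (MSE)
% The reported learning rate (0.7) and step (1) are treated here as settings
% of the authors' GUI and are not configured explicitly in this sketch.

net = train(net, X, T);   % training
Y   = net(X);             % simulation / prediction

mse_val  = mean((T - Y).^2);              % MSE
rmse_val = sqrt(mse_val);                 % RMSE
mad_val  = mean(abs(T - Y));              % MAD
mape_val = mean(abs((T - Y) ./ T)) * 100; % MAPE (percent)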