Humans are social beings who constantly seek contact with one another, and this drives them to communicate. Language therefore plays a central role, because it makes it easy to understand what other people want to convey. This creates a need for media that can bridge the world's many languages, one of which is machine translation. One method that can be used to build machine translators is Neural Machine Translation (NMT). Existing NMT models still have various shortcomings and need further development; among them is overfitting, which leaves the model less able to generalize to unseen test data. Many factors affect NMT performance, including the hyperparameter values and the model architecture used, yet there is no definitive rule for choosing the settings that produce the best-performing model. This study therefore aims to develop an NMT model architecture and run simulations over each Neural Network hyperparameter and the size of the model architecture, including batch size, number of epochs, optimizer, activation function, and dropout rate. The developed model overcomes the overfitting problem of the previous model, achieving an accuracy of 72.24% and a BLEU score of 45.83% on separate test data.
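The hyperparameter simulation described above amounts to an exhaustive sweep over candidate settings. A minimal sketch of such a grid search is shown below; the specific hyperparameter values are illustrative assumptions, not the settings used in this study.

```python
import itertools

# Hypothetical search space covering the hyperparameters named in the
# abstract; the concrete candidate values here are assumptions.
grid = {
    "batch_size": [32, 64, 128],
    "epochs": [10, 20],
    "optimizer": ["adam", "rmsprop"],
    "activation": ["relu", "tanh"],
    "dropout_rate": [0.2, 0.5],
}

def configurations(grid):
    """Yield every combination of hyperparameter settings as a dict."""
    keys = list(grid)
    for values in itertools.product(*(grid[k] for k in keys)):
        yield dict(zip(keys, values))

# Each configuration would be used to train one NMT model, whose
# accuracy and BLEU score are then compared on held-out data.
configs = list(configurations(grid))
print(len(configs))  # 3 * 2 * 2 * 2 * 2 = 48 candidate configurations
```

In practice, each of the 48 configurations would train a separate model, and the one with the best validation accuracy and BLEU score would be selected.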