This research presents an in-depth exploration and analysis of two types of Recurrent Neural Networks (RNNs), namely Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU). The two models are trained with the same parameters: each consists of three layers, uses the ReLU activation function, and applies a single dropout layer. To compare their performance, experiments were carried out on five groups of the dataset for training and evaluation. The evaluation covers metrics such as accuracy, recall, F1-score, and area under the curve (AUC). The dataset used is EEG Emotion, which contains 2458 unique variables. In terms of classification performance, LSTM outperforms GRU in the task of classifying emotional states from EEG signals, while GRU trains faster than LSTM. Although the accuracy of the two methods is nearly identical across all data splits, in the ROC-curve evaluation the LSTM model demonstrates superiority with a more optimal curve than GRU.
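As a rough illustration of the setup described above, the following is a minimal sketch of how the two compared models could be built in Keras: three layers, ReLU activation, and a single dropout layer, switching only the recurrent cell (LSTM or GRU). The layer widths, dropout rate, number of classes, timestep shape, and optimizer are assumptions for illustration and are not specified in the abstract.

```python
# Minimal sketch (assumptions: layer sizes, dropout rate of 0.2, 3 output
# classes, single-timestep input, and the Adam optimizer are illustrative
# choices, not taken from the paper).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(cell="lstm", timesteps=1, n_features=2458, n_classes=3):
    """Build a three-layer recurrent classifier with ReLU and one dropout layer."""
    RNN = layers.LSTM if cell == "lstm" else layers.GRU
    model = models.Sequential([
        layers.Input(shape=(timesteps, n_features)),
        RNN(128, activation="relu", return_sequences=True),  # recurrent layer 1
        RNN(64, activation="relu"),                           # recurrent layer 2
        layers.Dropout(0.2),                                   # single dropout layer
        layers.Dense(n_classes, activation="softmax"),         # output layer
    ])
    model.compile(
        optimizer="adam",
        loss="categorical_crossentropy",
        metrics=["accuracy",
                 tf.keras.metrics.AUC(name="auc"),
                 tf.keras.metrics.Recall(name="recall")],
    )
    return model

# The two models differ only in the recurrent cell used.
lstm_model = build_model("lstm")
gru_model = build_model("gru")
```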