This study compares LSTM and GRU, two recurrent neural network (RNN) models, for classifying emotions from electroencephalography (EEG) signals. Both models are well suited to sequential data, each with distinct strengths. In experiments on an EEG emotion dataset, LSTM outperformed GRU in emotion classification, although GRU trained faster. Evaluation metrics covering accuracy, recall, F1-score, and area under the curve (AUC) confirmed LSTM's advantage, which was particularly evident in the ROC curve analysis. The results offer insight into the relative efficacy of these RNN models for EEG-based emotion classification. The study also examines parameters such as the number of layers, the number of neurons, and the use of dropout, providing a detailed analysis of their impact on emotion recognition accuracy.

Purpose: The proposed model is the result of optimizing LSTM and GRU networks through careful hyperparameter tuning to find the best model for classifying EEG emotion data.

Methods: To improve the accuracy of the LSTM and GRU methods, hyperparameter tuning techniques were applied, such as adding recurrent layers, dense layers, and flatten layers, selecting the number of neurons, and introducing dropout to mitigate the risk of overfitting (a minimal sketch of such an architecture follows below). The goal was to find the best model for each method.

Results: The proposed model classifies EEG emotion data very effectively. The experimental results show that the LSTM model achieves a maximum accuracy of 100%, while the GRU model reaches a highest accuracy of approximately 98%.

Novelty: The novelty of this research lies in the optimization of hyperparameters for both LSTM and GRU methods, leading to the development of architectures capable of effectively classifying EEG emotion data.
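To make the Methods description concrete, the following is a minimal sketch of the kind of stacked recurrent classifier described above, assuming Keras/TensorFlow. The layer sizes, dropout rate, input dimensions (timesteps, channels), and class count are illustrative assumptions, not values reported in the paper; swapping the LSTM layers for GRU layers yields the comparison variant.

```python
# Minimal sketch of a stacked-LSTM EEG emotion classifier, assuming
# Keras/TensorFlow. TIMESTEPS, CHANNELS, and NUM_CLASSES are hypothetical
# placeholders, not values from the study.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout, Flatten

TIMESTEPS, CHANNELS, NUM_CLASSES = 128, 14, 3  # hypothetical EEG dimensions

model = Sequential([
    # Stacked recurrent layers; return_sequences=True keeps the full
    # sequence so it can be flattened before the dense head.
    LSTM(64, return_sequences=True, input_shape=(TIMESTEPS, CHANNELS)),
    Dropout(0.3),             # dropout to mitigate overfitting, as in Methods
    LSTM(32, return_sequences=True),
    Flatten(),                # flatten layer feeding the dense layers
    Dense(64, activation="relu"),
    Dropout(0.3),
    Dense(NUM_CLASSES, activation="softmax"),  # one output per emotion class
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Under this setup, the hyperparameter search described in Methods would vary the number of recurrent layers, the neuron counts, and the dropout rate, retraining and evaluating each configuration to select the best model for each method.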