Emotion recognition from electroencephalogram (EEG) signals is critical for enhancing human–computer interaction and mental-health monitoring. This paper proposes an explainable, real-time, dual-stream deep learning framework for EEG-based emotion classification. The model integrates a one-dimensional convolutional neural network (1D-CNN) for local feature extraction with a transformer encoder for global dependency modeling, fusing the two streams via multi-head attention. Lightweight Gray Wolf Optimization (LGWO) selects an optimal feature subset, and an ensemble of lightweight classifiers improves robustness. Experiments on the DEAP, SEED, BrainWave, and INTERFACE datasets demonstrate superior performance, with accuracies of 96.90%, 94.25%, 93.70%, and 92.80%, respectively, and an average inference latency of 5.2 ms per trial confirms real-time applicability. SHAP analysis interprets the model's decision-making process by identifying the most influential EEG channels and frequency components. These results validate the proposed model as a robust, accurate, and explainable solution for EEG-based emotion recognition, establishing a benchmark for future research in affective computing and clinical applications.
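The dual-stream design described above (a 1D-CNN local stream, a transformer-encoder global stream, and multi-head-attention fusion) can be sketched as follows. This is a minimal illustrative sketch in PyTorch under assumed layer sizes (`d_model=64`, 4 attention heads, 2 encoder layers) and class/channel counts; it is not the authors' implementation, and the LGWO feature selection and classifier ensemble are omitted.

```python
# Hypothetical sketch of the dual-stream EEG architecture: hyperparameters
# and module names are illustrative assumptions, not the paper's settings.
import torch
import torch.nn as nn

class DualStreamEEG(nn.Module):
    def __init__(self, n_channels=32, d_model=64, n_classes=4):
        super().__init__()
        # Local stream: 1D convolutions over the time axis of each trial
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, d_model, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # Global stream: transformer encoder over the conv feature sequence
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        # Fusion: multi-head attention, CNN features as queries over the
        # transformer's globally contextualized features
        self.fusion = nn.MultiheadAttention(
            d_model, num_heads=4, batch_first=True)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                        # x: (batch, channels, time)
        local = self.cnn(x).transpose(1, 2)      # (batch, time, d_model)
        global_ = self.encoder(local)            # (batch, time, d_model)
        fused, _ = self.fusion(local, global_, global_)
        return self.head(fused.mean(dim=1))      # (batch, n_classes) logits
```

A forward pass on a batch of trials shaped `(batch, channels, samples)` yields per-trial class logits; in practice the fused representation would feed the feature-selection and ensemble stages described in the abstract.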
Copyright © 2025