The widespread adoption of electronic learning (e-learning) in higher education has brought significant changes to how knowledge is delivered. Despite its advantages, many implementations remain focused solely on content dissemination, often neglecting learners’ emotional engagement. Emotional states, particularly in academic contexts, influence concentration, motivation, and comprehension. One of the most effective and intuitive indicators of emotion is facial expression. This research investigates the use of Convolutional Neural Networks (CNNs), a deep learning approach, to automatically detect student emotions through facial image analysis. A dataset of facial expressions was constructed and divided into training and testing sets, each containing five distinct emotional categories: anger, happiness, fear, neutrality, and surprise. The CNN model was trained for 100 epochs, achieving a training accuracy of 89% and a testing accuracy of 88%. These results demonstrate that CNN-based emotion recognition has strong potential to enhance e-learning platforms by providing instructors with real-time emotional insights. By integrating such emotional feedback, educators can adapt instructional strategies more effectively to improve student engagement and learning outcomes. This study contributes to the growing field of affective computing and emphasizes the importance of emotional awareness in digital learning environments.
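To illustrate the kind of pipeline the abstract describes, the following is a minimal sketch of a five-class facial-emotion CNN trained for 100 epochs. It assumes a Keras/TensorFlow setup with 48×48 grayscale face crops and the placeholder directories `dataset/train` and `dataset/test`; the layer sizes and hyperparameters are illustrative and are not taken from the paper's actual architecture.

```python
# Hypothetical sketch of a CNN emotion classifier for five classes
# (anger, happiness, fear, neutrality, surprise). Architecture and
# hyperparameters are assumptions, not the authors' exact model.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 5          # anger, happiness, fear, neutrality, surprise
IMG_SHAPE = (48, 48, 1)  # assumed grayscale input resolution

def build_model():
    model = models.Sequential([
        layers.Input(shape=IMG_SHAPE),
        layers.Rescaling(1.0 / 255),                           # normalize pixel values
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),                                    # regularization
        layers.Dense(NUM_CLASSES, activation="softmax"),        # class probabilities
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Load the pre-split training and testing sets from placeholder folders,
    # one subfolder per emotion category.
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "dataset/train", label_mode="categorical",
        image_size=(48, 48), color_mode="grayscale", batch_size=32)
    test_ds = tf.keras.utils.image_dataset_from_directory(
        "dataset/test", label_mode="categorical",
        image_size=(48, 48), color_mode="grayscale", batch_size=32)

    # Train for 100 epochs, as reported in the abstract, evaluating on the test set.
    model = build_model()
    model.fit(train_ds, validation_data=test_ds, epochs=100)
```

In practice, reported training and testing accuracies (here 89% and 88%) would be read from the final epoch's `accuracy` and `val_accuracy` metrics; any real deployment would also need a face-detection step to crop student faces from webcam frames before classification.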