This study aims to detect students' emotions during lecture hours using the YOLO (You Only Look Once) method. Emotions influence learning success: positive emotions can enhance motivation and understanding, while negative emotions can hinder the learning process. This research employs an artificial intelligence-based video analysis approach to recognize students' facial expressions in real time. The research stages include data acquisition from lecture videos, data preprocessing through annotation and labeling with bounding boxes, and the implementation of the YOLO method to detect three emotion categories: Enthusiastic, Confused, and Bored. Evaluation was conducted using precision, recall, and mean average precision (mAP) metrics. The test results showed that the model achieved an overall accuracy of 91.7%, with the best performance in the Enthusiastic category (97.0% accuracy) and good performance in the Bored category (93.4%). However, the model failed to detect the Confused emotion (0.0% accuracy), indicating the need for additional training data for that class. This study demonstrates that the YOLO method has the potential to assist lecturers in understanding students' emotional states, enabling more adaptive teaching. Further development is needed to improve accuracy across all emotion categories and ensure the system functions reliably.
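To make the evaluation metrics concrete, the sketch below computes per-class precision and recall from detection counts. This is an illustrative example only, not the study's code: the true-positive, false-positive, and false-negative counts are hypothetical values chosen to mimic the reported pattern (strong Enthusiastic and Bored performance, zero Confused detections).

```python
# Illustrative sketch: per-class precision and recall from detection counts.
# All counts are hypothetical, not taken from the study's dataset.

def precision(tp: int, fp: int) -> float:
    """Fraction of detections that were correct; 0.0 when there are no detections."""
    return tp / (tp + fp) if (tp + fp) else 0.0

def recall(tp: int, fn: int) -> float:
    """Fraction of ground-truth instances that were detected."""
    return tp / (tp + fn) if (tp + fn) else 0.0

# Hypothetical (true positives, false positives, false negatives) per class.
counts = {
    "Enthusiastic": (97, 3, 3),
    "Confused":     (0, 0, 40),   # no detections at all, as in the reported 0.0%
    "Bored":        (93, 7, 6),
}

for label, (tp, fp, fn) in counts.items():
    print(f"{label}: precision={precision(tp, fp):.3f} recall={recall(tp, fn):.3f}")
```

Note that a class with zero detections, like Confused here, yields both zero recall and an undefined precision (conventionally reported as 0.0), which is why such failures call for more training data rather than threshold tuning alone.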
Copyright © 2025