Human emotions can be expressed through facial expressions, and their automatic recognition has a wide range of applications, from human-computer interaction to behavior analysis. The researchers developed a YOLO-based model trained to recognize basic emotions: happy, sad, angry, and surprised. The dataset consists of facial images with corresponding emotion labels. This research produced a web application that detects human faces using the YOLO algorithm in real time. A total of 400 photos were used in the analysis, separated into four classes: happy, sad, angry, and surprised. Of the 400 images, 70% were used for training, 20% for validation, and 10% for testing. The model was trained for 200 epochs. The resulting model achieved a validation mAP of 90%, indicating that the YOLOv8 model reaches high object-detection accuracy on facial expressions. The experimental results show that the YOLO method can detect and classify emotions with a high degree of accuracy, demonstrating advantages in speed and efficiency over more conventional methods. This implementation opens opportunities for further development of real-time applications that use the YOLO method in a variety of settings.
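The 70%/20%/10% split of the 400 images described above can be sketched as follows. This is a minimal illustration, not the authors' actual preprocessing code; the function and file names are hypothetical. Training itself would then be run for 200 epochs in a YOLOv8 framework on the resulting train/validation sets.

```python
import random


def split_dataset(image_ids, train_frac=0.7, val_frac=0.2, seed=0):
    """Shuffle image IDs and partition them into train/val/test subsets.

    The test fraction is whatever remains after the train and
    validation fractions (here 10%).
    """
    ids = list(image_ids)
    random.Random(seed).shuffle(ids)  # deterministic shuffle for reproducibility
    n = len(ids)
    n_train = int(n * train_frac)
    n_val = int(n * val_frac)
    return (
        ids[:n_train],                    # 70% training
        ids[n_train:n_train + n_val],     # 20% validation
        ids[n_train + n_val:],            # remaining 10% test
    )


# 400 labeled facial images across the 4 emotion classes, as in the paper
# (filenames are illustrative placeholders)
images = [f"img_{i:03d}.jpg" for i in range(400)]
train, val, test = split_dataset(images)
print(len(train), len(val), len(test))  # 280 80 40
```

With 400 images this yields 280 training, 80 validation, and 40 test images, matching the proportions reported in the abstract.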
Copyright © 2024