Exploratory Data Analysis untuk Pembelajaran Daring Sinkron Berdasarkan Gambar Digital AFEA (Exploratory Data Analysis for Synchronous Online Learning Based on AFEA Digital Images)
Syefrida Yulina; Mona Elviyenti
Jurnal Nasional Teknik Elektro dan Teknologi Informasi, Vol. 11, No. 2, May 2022
Publisher: Departemen Teknik Elektro dan Teknologi Informasi, Fakultas Teknik, Universitas Gadjah Mada

DOI: 10.22146/jnteti.v11i2.3867

Abstract

The spread of COVID-19 throughout the world has affected the education sector. In some higher education institutions, such as Polytechnic Caltex Riau (PCR), students are required to participate in synchronous or asynchronous learning activities via virtual classrooms. Synchronous online learning is usually supported by video conferencing media such as Google Meet or Zoom. The communication between lecturers and students is captured as an image as evidence of students' interaction and participation in a given subject. These images can help lecturers assess students' internal feelings and measure their interest through facial emotions. With this in mind, the current research aims to analyze the emotions detected in facial expressions in these images using automatic facial expression analysis (AFEA) and exploratory data analysis (EDA), then visualize the data to identify possible ways to improve the sustainability of the educational process. The AFEA steps applied were face acquisition to detect facial regions in an image, facial data extraction and representation to perform feature extraction on the face, and facial expression recognition to classify faces into emotional expressions. This paper presents the results of applying machine learning algorithms to classify facial expressions into happy and unhappy emotions, with mean values of 5.58 and 2.70, respectively. The data consist of 1,206 images taken from the second semester of the 2020/2021 academic year. The results highlight that students' facial emotions varied with lecture type, hour, department, and class, indicating that several factors contribute to the variance in the facial emotions classified during synchronous online learning.
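
The abstract describes a three-step AFEA pipeline (face acquisition, facial data extraction and representation, facial expression recognition) followed by EDA over the detected emotions grouped by lecture metadata. The sketch below illustrates how such a pipeline could be wired together; it is not the authors' implementation. The Haar cascade face detector, the classify_expression placeholder, and the metadata column names are assumptions for illustration only.

```python
# Minimal sketch of an AFEA + EDA pipeline along the lines described in the
# abstract. Assumptions (not from the paper): OpenCV's Haar cascade is used
# for face acquisition, the expression classifier is a placeholder, and the
# EDA column names (lecture_type, happy_count, unhappy_count) are hypothetical.
import cv2
import pandas as pd

# Face acquisition: detect face regions in a captured virtual-classroom image.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)


def classify_expression(face_crop) -> str:
    """Placeholder for feature extraction + expression recognition.

    A real implementation would extract features from the face crop and feed
    them to a trained machine learning model that outputs happy/unhappy.
    """
    raise NotImplementedError


def analyze_screenshot(path: str) -> list[str]:
    """Face acquisition -> feature extraction -> expression recognition."""
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [classify_expression(gray[y:y + h, x:x + w]) for (x, y, w, h) in faces]


# EDA step: aggregate per-image emotion counts against lecture metadata
# (lecture type, hour, department, class) and compare group means.
records = pd.DataFrame({
    "lecture_type": ["theory", "practicum"],   # hypothetical example data
    "happy_count": [6, 3],
    "unhappy_count": [2, 4],
})
print(records.groupby("lecture_type")[["happy_count", "unhappy_count"]].mean())
```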