The Investigation of Convolution Layer Structure on BERT-C-LSTM for Topic Classification of Indonesian News Headlines
Fabillah, Dzakira; Auliarahmi, Rizka; Setiarini, Siti Dwi; Gelar, Trisna
Journal of Software Engineering, Information and Communication Technology (SEICT) Vol 4, No 2: December 2023
Publisher : Universitas Pendidikan Indonesia (UPI)

DOI: 10.17509/seict.v4i2.63742

Abstract

An efficient and accurate method for classifying news articles by topic is essential for applications such as personalized news recommendation systems and market research. Manual classification is tedious, prompting this study to automate the process with deep learning techniques. The developed model, BERT-C-LSTM, combines BERT, the convolutional layer from CNN, and LSTM, leveraging their individual strengths. BERT excels at transforming text into context-dependent vector representations. The classification model employs a blend of convolutional layers and LSTM, referred to as C-LSTM: the convolutional layer extracts salient features, such as keywords and phrases, from the input, while the Long Short-Term Memory (LSTM) model captures the temporal context present in sequential data. This study investigates the influence of the convolutional layer structure in BERT-C-LSTM on the classification of Indonesian news headlines categorized into eight topics. The results indicate no significant differences in accuracy between BERT-C-LSTM architectures with a single convolutional layer and those with multiple parallel convolutional layers, nor between models using various filter sizes. Furthermore, the BERT-C-LSTM model achieves accuracy comparable to the BERT-LSTM and BERT-CNN models, reaching 92.6%, 92.1%, and 92.7%, respectively.
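
To make the architecture concrete, the following is a minimal sketch of a BERT-C-LSTM classifier in PyTorch with the HuggingFace transformers library. The checkpoint name (indobenchmark/indobert-base-p1), filter count, kernel size, and LSTM hidden size are illustrative assumptions; the abstract does not report the paper's exact hyperparameters.

import torch
import torch.nn as nn
from transformers import AutoModel

class BertCLSTM(nn.Module):
    def __init__(self, checkpoint="indobenchmark/indobert-base-p1",  # assumed Indonesian BERT checkpoint
                 num_classes=8, conv_filters=128, kernel_size=3, lstm_hidden=128):
        super().__init__()
        self.bert = AutoModel.from_pretrained(checkpoint)
        hidden = self.bert.config.hidden_size  # 768 for BERT-base
        # Convolutional layer extracts local n-gram features
        # (keywords/phrases) from the contextual token embeddings.
        self.conv = nn.Conv1d(hidden, conv_filters,
                              kernel_size=kernel_size, padding="same")
        # LSTM captures the sequential (temporal) structure of the
        # convolved feature maps.
        self.lstm = nn.LSTM(conv_filters, lstm_hidden, batch_first=True)
        self.fc = nn.Linear(lstm_hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        # Contextual embeddings from BERT: (batch, seq_len, hidden)
        x = self.bert(input_ids=input_ids,
                      attention_mask=attention_mask).last_hidden_state
        # Conv1d expects channels first: (batch, hidden, seq_len)
        x = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        _, (h_n, _) = self.lstm(x)   # final LSTM hidden state
        return self.fc(h_n[-1])      # logits over the 8 topic classes

A parallel-convolution variant, as compared in the study, would replace self.conv with several Conv1d modules of different kernel sizes and concatenate their outputs along the channel dimension before the LSTM.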