Jurnal INFOTEL
Vol 17 No 3 (2025): August

Examining Attention Mechanisms in Hybrid Deep Learning for Sentiment Analysis Across Text Lengths

Aqilla, Livia Naura
Sibaroni, Yuliant



Article Info

Publish Date
31 Aug 2025

Abstract

Sentiment analysis is a key task in natural language processing (NLP) with applications across a wide range of domains. This study examines the impact of self-attention and global attention placement in CNN-BiLSTM and CNN-LSTM models, exploring their effectiveness when positioned before, after, or both before and after the BiLSTM/LSTM layer, particularly for texts of different lengths. Instead of applying attention mechanisms in a fixed position, this research explores the most suitable type and placement of attention to improve model understanding and adaptability across datasets with different text lengths. Experiments were conducted on the IMDB Movie Reviews dataset and the Twitter US Airline Sentiment dataset. The results show that for long texts, CNN-BiLSTM with self-attention both before and after the BiLSTM achieves an F1 score of 93.77% (+2.72%), while for short texts it reaches 82.70% (+2.24%). These findings highlight that optimal attention placement significantly improves sentiment classification accuracy. The study provides insights into designing more effective hybrid deep learning models and contributes to future research on multilingual and multi-domain sentiment analysis, where attention mechanisms can be adapted to different textual structures.
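The best-performing variant described above wraps the recurrent stage with attention on both sides. As a minimal sketch of that idea (not the authors' implementation), the snippet below implements single-head scaled dot-product self-attention in NumPy, without learned projection matrices, and applies it before and after a stand-in encoder function that takes the place of the CNN-BiLSTM stage:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    """Scaled dot-product self-attention (single head, no learned
    Q/K/V projections, for illustration only).
    x: (seq_len, d) token features -> (seq_len, d) re-weighted features."""
    d = x.shape[-1]
    scores = (x @ x.T) / np.sqrt(d)      # (seq_len, seq_len) similarity
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ x                   # contextualized token features

def encoder(x):
    """Placeholder for the BiLSTM stage; identity here so the sketch
    stays dependency-free."""
    return x

rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, 8))         # 5 tokens, 8-dim features

# Attention applied both before and after the encoder, mirroring the
# "before and after BiLSTM" placement studied in the paper.
out = self_attention(encoder(self_attention(tokens)))
print(out.shape)                         # (5, 8)
```

In a real model the encoder would be a trained CNN-BiLSTM and the attention layers would carry learned parameters; the point of the sketch is only that attention is a shape-preserving map over the token sequence, so it can be inserted at either or both positions without changing the rest of the pipeline.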

Copyrights © 2025






Journal Info

Abbrev

infotel

Subject

Computer Science & IT Electrical & Electronics Engineering

Description

Jurnal INFOTEL is a scientific journal published by Lembaga Penelitian dan Pengabdian Masyarakat (LPPM) of Institut Teknologi Telkom Purwokerto, Indonesia. Jurnal INFOTEL covers the field of informatics, telecommunication, and electronics. First published in 2009 for a printed version and published ...