Enhancing Text Classification Performance: A Comparative Study of RNN and GRU Architectures with Attention Mechanisms
Yulita Ayu Wardani; Mery Oktaviyanti Puspitaningtyas; Happid Ridwan Ilmi; Onesinus Saut Parulian
Journal of Applied Research In Computer Science and Information Systems, Vol. 2 No. 2 (2024): December 2024
Publisher: PT. BERBAGI TEKNOLOGI SEMESTA

DOI: 10.61098/jarcis.v2i2.187

Abstract

Text classification plays a crucial role in natural language processing, and enhancing its performance is an ongoing area of research. This study investigates the impact of integrating attention mechanisms into recurrent neural network (RNN)-based architectures, including RNN, LSTM, GRU, and their bidirectional variants (BiLSTM and BiGRU), for text sentiment analysis. Three attention mechanisms, Multi-Head Attention, Self-Attention, and Adaptive Attention, are applied to evaluate their effectiveness in improving model accuracy. The results reveal that attention mechanisms significantly enhance performance by enabling models to focus on the most relevant parts of the input text. Among the tested configurations, the LSTM model with Multi-Head Attention achieved the highest accuracy, 68.34%. The findings underscore the critical role of attention mechanisms in overcoming traditional RNN limitations, such as difficulty in capturing long-term dependencies, and highlight the potential for their application in broader text classification tasks.
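
To make the best-performing configuration the abstract reports concrete, below is a minimal sketch of an LSTM sentiment classifier with multi-head attention over the recurrent hidden states. This is not the paper's code; the framework (TensorFlow/Keras), vocabulary size, sequence length, and all layer dimensions are assumptions chosen for illustration.

```python
# Minimal sketch (assumed hyperparameters, not the paper's exact setup):
# an LSTM encoder whose full hidden-state sequence is re-weighted by
# multi-head self-attention before binary sentiment classification.
import tensorflow as tf
from tensorflow.keras import layers, Model

VOCAB_SIZE = 20_000  # assumed vocabulary size
MAX_LEN = 128        # assumed maximum sequence length
EMBED_DIM = 128      # assumed embedding dimension

tokens = layers.Input(shape=(MAX_LEN,), dtype="int32")
x = layers.Embedding(VOCAB_SIZE, EMBED_DIM, mask_zero=True)(tokens)

# return_sequences=True keeps every timestep's hidden state,
# giving the attention layer a full sequence to attend over.
hidden = layers.LSTM(64, return_sequences=True)(x)

# Multi-head self-attention over the LSTM outputs lets the model
# focus on the most relevant positions in the input text.
attended = layers.MultiHeadAttention(num_heads=4, key_dim=16)(hidden, hidden)

# Pool the attended sequence into one vector and classify the sentiment.
pooled = layers.GlobalAveragePooling1D()(attended)
output = layers.Dense(1, activation="sigmoid")(pooled)

model = Model(tokens, output)
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Replacing the `MultiHeadAttention` layer with a single-head self-attention or an adaptive attention variant, or swapping the `LSTM` for `GRU`/`Bidirectional` wrappers, yields the other configurations the study compares.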