Journal of Applied Research In Computer Science and Information Systems
Vol. 2 No. 2 (2024): December 2024

Enhancing Text Classification Performance: A Comparative Study of RNN and GRU Architectures with Attention Mechanisms

Yulita Ayu Wardani
Mery Oktaviyanti Puspitaningtyas
Happid Ridwan Ilmi
Onesinus Saut Parulian



Article Info

Publish Date
30 Dec 2024

Abstract

Text classification plays a crucial role in natural language processing, and enhancing its performance is an ongoing area of research. This study investigates the impact of integrating attention mechanisms into recurrent neural network (RNN) based architectures, including RNN, LSTM, GRU, and their bidirectional variants (BiLSTM and BiGRU), for text sentiment analysis. Three attention mechanisms (Multihead Attention, Self-Attention, and Adaptive Attention) are applied to evaluate their effectiveness in improving model accuracy. The results reveal that attention mechanisms significantly enhance performance by enabling models to focus on the most relevant parts of the input text. Among the tested configurations, the LSTM model with Multihead Attention achieved the highest accuracy, 68.34%. The findings underscore the critical role of attention mechanisms in overcoming traditional RNN limitations, such as difficulty in capturing long-term dependencies, and highlight their potential for application in broader text classification tasks.
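The best-performing configuration reported in the abstract, an LSTM combined with Multihead Attention, can be sketched roughly as follows. This is an illustrative PyTorch implementation, not the authors' code: the class name, vocabulary size, embedding/hidden dimensions, number of heads, and the mean-pooling step are all assumptions made for the sake of a runnable example.

```python
import torch
import torch.nn as nn

class LSTMWithMultiheadAttention(nn.Module):
    """Hypothetical sketch: LSTM encoder followed by multi-head
    self-attention over the hidden states, then a linear classifier.
    All sizes below are illustrative, not taken from the paper."""
    def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=128,
                 num_heads=4, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attention = nn.MultiheadAttention(hidden_dim, num_heads,
                                               batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        x = self.embedding(token_ids)          # (batch, seq_len, embed_dim)
        h, _ = self.lstm(x)                    # (batch, seq_len, hidden_dim)
        attn_out, _ = self.attention(h, h, h)  # self-attention over LSTM states
        pooled = attn_out.mean(dim=1)          # average over time steps
        return self.classifier(pooled)         # (batch, num_classes)

model = LSTMWithMultiheadAttention()
logits = model(torch.randint(0, 10000, (2, 16)))  # 2 sequences of length 16
print(logits.shape)  # torch.Size([2, 2])
```

The attention layer here attends over all LSTM hidden states rather than relying on the final state alone, which is the mechanism the abstract credits for helping the model focus on the most relevant parts of the input.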

Copyright © 2024






Journal Info

Abbrev

JARCIS

Subject

Computer Science & IT; Control & Systems Engineering; Decision Sciences, Operations Research & Management; Industrial & Manufacturing Engineering

Description

Journal of Applied Research In Computer Science and Information Systems (JARCIS) is dedicated to publishing and disseminating research results and theoretical discussions, applied analysis, and literature studies in the fields of information technology, computer science, and information systems. The ...