HYBRID INTEGRATION OF BERT AND BILSTM MODELS FOR SENTIMENT ANALYSIS
Tambunan, Nicolas Ray Amarco; Saputro, Dewi Retno Sari; Widyaningsih, Purnami
BAREKENG: Jurnal Ilmu Matematika dan Terapan Vol 20 No 2 (2026): BAREKENG: Journal of Mathematics and Its Application
Publisher : PATTIMURA UNIVERSITY

DOI: 10.30598/barekengvol20iss2pp1719-1730

Abstract

The rapid growth of sentiment analysis research has driven increasing interest in deep learning models, particularly transformer-based architectures such as BERT and recurrent neural networks like BiLSTM. While both approaches have shown substantial success in text classification tasks, each presents distinct strengths and limitations. This study analyzes the integration of BERT and BiLSTM models to enhance sentiment classification performance by combining contextual and sequential learning. A bibliometric analysis was conducted using VOSviewer on Scopus-indexed publications from 2020 to 2025, identifying four major thematic clusters related to transformer modeling, recurrent architectures, hybrid integration, and methodological advancements. Comparative findings from benchmark datasets, including SST-2, IMDb, and Yelp Reviews, indicate that hybrid BERT–BiLSTM models achieve superior accuracy compared to single models, reaching up to 97.67% on the IMDb dataset. However, this improvement comes at the cost of increased computational complexity. The proposed framework combines BERT’s contextual embeddings with BiLSTM’s sequential modeling, offering a foundation for developing adaptive and multilingual sentiment analysis systems. The results highlight future directions in optimizing hybrid architectures for efficiency, cross-lingual adaptability, and domain-specific sentiment understanding.
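The hybrid design the abstract describes (BERT's contextual token embeddings feeding a BiLSTM classification head) can be sketched roughly as below. This is a minimal illustration, not the paper's implementation: all class and parameter names are invented for the example, and a random tensor stands in for the output of a pretrained BERT encoder so the sketch stays self-contained.

```python
import torch
import torch.nn as nn

class BertBiLSTMClassifier(nn.Module):
    """Illustrative sketch: a BiLSTM head over BERT-style contextual
    embeddings (hidden size 768), followed by a linear classifier."""

    def __init__(self, hidden_size=768, lstm_hidden=128, num_classes=2):
        super().__init__()
        # In a full model, a pretrained BERT encoder would produce the
        # token embeddings; here they arrive as an input tensor.
        self.bilstm = nn.LSTM(hidden_size, lstm_hidden,
                              batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, token_embeddings):
        # token_embeddings: (batch, seq_len, hidden_size)
        lstm_out, _ = self.bilstm(token_embeddings)
        # Classify from the last time step of the concatenated
        # forward/backward hidden states.
        return self.classifier(lstm_out[:, -1, :])

# Stand-in for BERT output: 4 sequences of 16 tokens, 768 dimensions.
dummy_embeddings = torch.randn(4, 16, 768)
model = BertBiLSTMClassifier()
logits = model(dummy_embeddings)
print(logits.shape)  # torch.Size([4, 2])
```

In practice the dummy tensor would be replaced by the `last_hidden_state` of a pretrained BERT model, and the choice of pooling (last step vs. mean over tokens) is one of the design decisions hybrid studies compare.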