Sunendar, Nendi Sunendar
Unknown Affiliation

Published: 1 document
Articles


COMPARATIVE PERFORMANCE OF TRANSFORMER AND LSTM MODELS FOR INDONESIAN INFORMATION RETRIEVAL WITH INDOBERT
Sunendar, Nendi Sunendar; Saputra, Irwansyah
Jurnal Pilar Nusa Mandiri Vol. 21 No. 2 (2025): Pilar Nusa Mandiri: Journal of Computing and Information System
Publisher: LPPM Universitas Nusa Mandiri

DOI: 10.33480/pilar.v21i2.6920

Abstract

Neural network-based Information Retrieval (IR), particularly with Transformer models, has become prominent in information search technology. However, its application to Indonesian, a low-resource language, remains limited. This study compares the performance of an LSTM model and IndoBERT on IR tasks in Indonesian. The dataset consists of 5,000 query–document pairs collected by scraping three Indonesian news portals: CNN Indonesia, Kompas, and Detik. Evaluation used the MAP (mean average precision), MRR (mean reciprocal rank), Precision@5, and Recall@5 metrics. The results show that IndoBERT outperforms LSTM on all metrics, with a MAP of 0.82 and an MRR of 0.84, while LSTM reached only a MAP of 0.63 and an MRR of 0.65. These findings confirm that Transformer models such as IndoBERT capture the semantic relevance between queries and documents more effectively, even with a limited dataset.
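
The abstract does not include code, but the kind of evaluation it describes can be illustrated with a short sketch: embed a query and candidate documents with an IndoBERT checkpoint, rank the documents by cosine similarity, and score the ranking with the metrics named above. This is a minimal sketch under assumptions, not the authors' implementation: the bi-encoder setup, mean pooling, cosine scoring, and the indobenchmark/indobert-base-p1 checkpoint are all assumptions, since the abstract does not specify the scoring architecture or any fine-tuning.

```python
# Minimal sketch (not the paper's code): rank Indonesian documents for a query
# with IndoBERT embeddings, then score the ranking with the metrics named in
# the abstract. The checkpoint name, mean pooling, and cosine similarity are
# assumptions; the abstract does not describe the exact scoring architecture.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "indobenchmark/indobert-base-p1"  # assumed IndoBERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()


def embed(texts):
    """Mean-pool IndoBERT token embeddings into one vector per text."""
    enc = tokenizer(texts, padding=True, truncation=True,
                    max_length=256, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state        # (batch, seq, dim)
    mask = enc["attention_mask"].unsqueeze(-1).float()
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)


def rank(query, documents):
    """Return document indices sorted by cosine similarity to the query."""
    scores = F.cosine_similarity(embed([query]), embed(documents))
    return scores.argsort(descending=True).tolist()


def average_precision(ranking, relevant):
    """AP with binary relevance; MAP is the mean of AP over all queries."""
    hits, ap = 0, 0.0
    for i, doc in enumerate(ranking):
        if doc in relevant:
            hits += 1
            ap += hits / (i + 1)
    return ap / max(len(relevant), 1)


def reciprocal_rank(ranking, relevant):
    """1 / rank of the first relevant document; MRR averages this over queries."""
    for i, doc in enumerate(ranking):
        if doc in relevant:
            return 1.0 / (i + 1)
    return 0.0


def precision_recall_at_k(ranking, relevant, k=5):
    """Precision@k and Recall@k with binary relevance."""
    hits = sum(1 for doc in ranking[:k] if doc in relevant)
    return hits / k, hits / max(len(relevant), 1)


# Toy usage with made-up documents; document index 2 is the only relevant one.
docs = [
    "Harga beras naik di pasar tradisional.",
    "Timnas Indonesia lolos ke babak final.",
    "Pemerintah mengumumkan kebijakan subsidi energi baru.",
    "Cuaca ekstrem melanda wilayah Jawa Barat.",
]
order = rank("kebijakan subsidi energi pemerintah", docs)
relevant = {2}
print("AP :", average_precision(order, relevant))
print("RR :", reciprocal_rank(order, relevant))
print("P@5, R@5:", precision_recall_at_k(order, relevant, k=5))
```

In a full evaluation, these per-query scores would be averaged over all queries in the 5,000-pair dataset to obtain the MAP, MRR, Precision@5, and Recall@5 figures reported in the abstract.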