Comparative Performance of Transformer Models for Cultural Heritage in NLP Tasks
Tri Lathif Mardi Suryanto; Aji Prasetya Wibawa; Hariyono Hariyono; Andrew Nafalski
Advance Sustainable Science Engineering and Technology Vol. 7 No. 1 (2025): November-January
Publisher : Science and Technology Research Centre Universitas PGRI Semarang

DOI: 10.26877/asset.v7i1.1211

Abstract

AI and machine learning are crucial in advancing technology, especially for processing large, complex datasets. The transformer model, a primary architecture in natural language processing (NLP), enables applications such as translation, text summarization, and question-answering (QA) systems. This study compares two popular transformer models, FlanT5 and mT5, which are widely used yet often struggle to capture the specific context of a reference text. Using a unique Goddess Durga QA dataset containing specialized cultural knowledge about Indonesia, this research tests how effectively each model handles culturally specific QA tasks. The study involved data preparation, initial model training, ROUGE metric evaluation (ROUGE-1, ROUGE-2, ROUGE-L, and ROUGE-Lsum), and result analysis. The findings show that FlanT5 outperforms mT5 on multiple metrics, making it better at preserving cultural context. These results are relevant to NLP applications that rely on cultural insight, such as cultural-preservation QA systems and context-based educational platforms.
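To make the evaluation protocol concrete, the sketch below shows how the ROUGE-1 and ROUGE-L scores named in the abstract are computed. This is a minimal, self-contained illustration of the standard metric definitions (unigram overlap and longest common subsequence), not the authors' actual evaluation code; the example reference/candidate pair is invented for demonstration.

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: F-measure over overlapping unigrams."""
    ref, cand = reference.split(), candidate.split()
    # Clipped unigram overlap: each token counted at most as often
    # as it appears in the reference.
    overlap = sum((Counter(ref) & Counter(cand)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(cand)
    recall = overlap / len(ref)
    return 2 * precision * recall / (precision + recall)

def rouge_l_f1(reference: str, candidate: str) -> float:
    """ROUGE-L F1: F-measure based on the longest common subsequence."""
    ref, cand = reference.split(), candidate.split()
    # Classic dynamic-programming LCS over the two token sequences.
    dp = [[0] * (len(cand) + 1) for _ in range(len(ref) + 1)]
    for i, r in enumerate(ref):
        for j, c in enumerate(cand):
            dp[i + 1][j + 1] = dp[i][j] + 1 if r == c else max(dp[i][j + 1], dp[i + 1][j])
    lcs = dp[-1][-1]
    if lcs == 0:
        return 0.0
    precision = lcs / len(cand)
    recall = lcs / len(ref)
    return 2 * precision * recall / (precision + recall)

# Hypothetical example: gold answer vs. a model's generated answer.
ref = "durga is a hindu goddess"
hyp = "durga is a goddess"
print(round(rouge1_f1(ref, hyp), 4))   # unigram-overlap F1
print(round(rouge_l_f1(ref, hyp), 4))  # LCS-based F1
```

In practice a library implementation (e.g. the `rouge_score` package used by most Hugging Face evaluation pipelines) would also handle stemming and sentence splitting (the distinction between ROUGE-L and ROUGE-Lsum), but the core scores reduce to the two quantities above.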