Contextual Relevance-Driven Question Answering Generation: Experimental Insights Using Transformer-Based Models
Suryanto, Tri Lathif Mardi; Wibawa, Aji Prasetya; Hariyono, Hariyono; Shili, Hechmi
International Journal of Engineering, Science and Information Technology Vol 5, No 4 (2025)
Publisher : Malikussaleh University, Aceh, Indonesia

DOI: 10.52088/ijesty.v5i4.989

Abstract

This study investigates the impact of contextual relevance and hyperparameter tuning on the performance of Transformer-based models in Question-Answer Generation (QAG). Utilising the FlanT5 model, experiments were conducted on a domain-specific dataset to assess how variations in learning rate and training epochs affect model accuracy and generalisation. Six QAG models were developed (QAG-A to QAG-F), each evaluated using ROUGE metrics to measure the quality of generated question-answer pairs. Results show that QAG-F and QAG-D achieved the highest performance, with QAG-F reaching a ROUGE-LSum of 0.4985. The findings highlight that careful tuning of learning rates and training duration significantly improves model performance, enabling more accurate and contextually appropriate question generation. Furthermore, the ability to generate both questions and answers from a single input enhances the interactivity and utility of NLP systems, particularly in knowledge-intensive domains. This study underscores the importance of contextual modelling and hyperparameter optimisation in generative NLP tasks, offering practical insights for improving chatbot development, educational tools, and digital heritage applications.
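The ROUGE-L and ROUGE-LSum scores reported above are based on the longest common subsequence (LCS) between a generated text and a reference. The following is a minimal illustrative sketch of ROUGE-L F1 scoring, re-implemented from the standard LCS definition; it is not the authors' evaluation code, and the example question strings are hypothetical.

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of token lists a and b."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, tok_a in enumerate(a, 1):
        for j, tok_b in enumerate(b, 1):
            dp[i][j] = dp[i-1][j-1] + 1 if tok_a == tok_b else max(dp[i-1][j], dp[i][j-1])
    return dp[len(a)][len(b)]

def rouge_l_f1(candidate: str, reference: str) -> float:
    """ROUGE-L F1: harmonic mean of LCS-based precision and recall."""
    cand, ref = candidate.split(), reference.split()
    lcs = lcs_length(cand, ref)
    if lcs == 0:
        return 0.0
    precision, recall = lcs / len(cand), lcs / len(ref)
    return 2 * precision * recall / (precision + recall)

# Hypothetical generated question vs. a reference question:
score = rouge_l_f1("what is the capital city of france",
                   "what is the capital of france")
print(round(score, 3))  # 0.923
```

In practice, evaluation toolkits such as the `rouge-score` package compute ROUGE-LSum by applying this LCS-based score at the sentence level across a summary, which is how multi-sentence QA outputs would typically be scored.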