DOES PERSONALIZATION MATTER IN PROMPTING? A CASE STUDY OF CLASSIFYING PAPER METADATA USING ZERO-SHOT PROMPTING
Chandra Lesmana; Muhammad Okky Ibrohim; Indra Budi
Jurnal INSTEK (Informatika Sains dan Teknologi) Vol 10 No 1 (2025): APRIL
Publisher : Department of Informatics Engineering, Faculty of Science and Technology, Universitas Islam Negeri Alauddin, Makassar, Indonesia

DOI: 10.24252/instek.v10i1.57445

Abstract

A Systematic Literature Review (SLR) is a structured way for researchers to survey the state of research on a topic. It is a method preferred by many researchers because its process involves systematic, objective analysis focused on answering research questions. In general, conducting an SLR involves three stages: planning, implementation, and reporting. However, compiling an SLR takes a long time because every stage must be carried out one by one. To overcome this problem, automation is needed to speed up the SLR compilation process. Previous studies have automated SLR document classification using machine learning models that require large amounts of training data, such as Naïve Bayes, Support Vector Machine, and Logistic Model Tree. In this study, the authors automated the process using open-source Large Language Models (LLMs), namely Mistral-7B-Instruct-v0.2 and LLaMA-3.1-8B, to classify the titles and abstracts of SLR documents, and compared the effect of personalization on zero-shot prompting. With zero-shot prompting, the classification process no longer requires training data, so there is no data annotation cost. Experimental results showed that personalization improved classification performance, with the best result, a Macro F1 of 0.5538, obtained using the LLaMA-3.1 model.
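
The comparison described in the abstract can be illustrated with a minimal sketch: the same title and abstract are sent to an instruction-tuned LLM twice, once with a plain instruction and once with a prompt that also assigns the model a researcher persona (the "personalization"). The checkpoint name, persona wording, and INCLUDE/EXCLUDE label set below are illustrative assumptions, not the prompts or labels used in the paper.

# Minimal zero-shot prompting sketch (assumed setup, not the authors' code).
from transformers import pipeline

MODEL_NAME = "meta-llama/Llama-3.1-8B-Instruct"  # assumed checkpoint; Mistral-7B-Instruct-v0.2 would be swapped in the same way
classifier = pipeline("text-generation", model=MODEL_NAME)

PLAIN_INSTRUCTION = (
    "Decide whether the following paper should be included in the literature "
    "review. Answer with exactly one word: INCLUDE or EXCLUDE."
)
PERSONA = (
    "You are an experienced researcher screening papers for a systematic "
    "literature review. "
)

def classify(title, abstract, personalized):
    # Zero-shot: no labelled examples are provided, only the instruction and the paper metadata.
    system_msg = (PERSONA if personalized else "") + PLAIN_INSTRUCTION
    messages = [
        {"role": "system", "content": system_msg},
        {"role": "user", "content": f"Title: {title}\nAbstract: {abstract}"},
    ]
    result = classifier(messages, max_new_tokens=5)
    # For chat-style input the pipeline returns the conversation with the
    # assistant's reply appended as the last message.
    return result[0]["generated_text"][-1]["content"].strip()

# Hypothetical usage:
# label_plain = classify(title, abstract, personalized=False)
# label_persona = classify(title, abstract, personalized=True)

Because the instruction, the input, and the decoding settings stay the same across the two conditions, scoring both on a labelled evaluation set (for example with macro-averaged F1, as reported in the paper) isolates the effect of the persona text alone.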