Articles

Journal: Journal of Applied Data Sciences

Optimizing Function-Level Source Code Classification Using Meta-Trained CodeBERT in Low-Resource Settings
Septiadi, Abednego Dwi; Prasetyo, Muhamad Awiet Wiedanto; Daffa, Geusan Edurais Aria
Journal of Applied Data Sciences Vol 6, No 3: September 2025
Publisher: Bright Publisher

DOI: 10.47738/jads.v6i3.902

Abstract

This study investigates the effectiveness of a meta-trained transformer-based model, CodeBERT, for classifying source code functions in environments with limited labeled data. The primary objective is to improve the accuracy and generalizability of function-level code classification using few-shot learning, a strategy where the model learns from only a few labeled examples per category. We introduce a meta-learning framework designed to enable CodeBERT to adapt to new function types with minimal supervision, addressing a common limitation in traditional code classification methods that require extensive labeled datasets and manual feature engineering. The methodology involves episodic few-shot classification, where each episode simulates a low-resource task using five labeled and five unlabeled samples per function class. A balanced subset of Python functions was sampled from the CodeXGLUE benchmark, consisting of ten function categories with equal representation. The source code was preprocessed by removing comments and docstrings, then tokenized into a fixed length of 128 tokens to fit the model's input format. The meta-trained CodeBERT was evaluated across 10 episodes, each representing a different task composition. Results show that the model achieves an average classification accuracy of 73.0%, with high accuracy on function categories characterized by unique syntax patterns, and lower performance on categories with overlapping logic or naming structures. Despite this variability, the model maintained accuracy above 60% in all episodes. These findings suggest that meta-learning significantly enhances the adaptability of CodeBERT to unseen tasks under data-constrained conditions. This research demonstrates that meta-trained transformer models can serve as practical tools for real-time code analysis, particularly in integrated development environments and continuous integration pipelines. Future work may include extending the framework to other programming languages and incorporating semantic code representations to further reduce classification ambiguity.
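The abstract describes a concrete pipeline: comment and docstring removal, truncation to a fixed 128-token input, and episodes built from five labeled and five unlabeled samples per class. The Python sketch below illustrates one such episode. It is an approximation, not the authors' code: the "microsoft/codebert-base" checkpoint, the regex-based comment stripping, and the nearest-centroid (prototypical-network-style) episode classifier over [CLS] embeddings are all assumptions, since the abstract does not name the episode-level classifier or publish an implementation.

# Minimal sketch of one evaluation episode under the assumptions stated above.
import re
import torch
from transformers import RobertaModel, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("microsoft/codebert-base")
model = RobertaModel.from_pretrained("microsoft/codebert-base")
model.eval()

def preprocess(code: str) -> str:
    # Strip docstrings and line comments, per the abstract's preprocessing
    # step (regexes are a rough stand-in for a real Python parser).
    code = re.sub(r"('''|\"\"\")[\s\S]*?\1", "", code)
    return re.sub(r"#[^\n]*", "", code)

@torch.no_grad()
def embed(snippets):
    # Tokenize to a fixed length of 128 tokens and take [CLS] embeddings.
    batch = tokenizer([preprocess(s) for s in snippets], max_length=128,
                      truncation=True, padding="max_length",
                      return_tensors="pt")
    return model(**batch).last_hidden_state[:, 0, :]   # shape (n, 768)

def run_episode(support, query):
    # support: {class_label: five labeled snippets}; query: unlabeled snippets.
    # Each query sample is assigned to the class whose support centroid
    # ("prototype") is nearest in embedding space.
    labels = list(support)
    protos = torch.stack([embed(support[l]).mean(dim=0) for l in labels])
    dists = torch.cdist(embed(query), protos)          # Euclidean distances
    return [labels[i] for i in dists.argmin(dim=1).tolist()]

A full run in the spirit of the paper would sample five support and five unlabeled query functions for each of the ten CodeXGLUE categories, score the query predictions, and average accuracy over ten such episodes, mirroring the protocol behind the 73.0% average reported above.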