Development of a Distributed Gradient Boosting Forest Algorithm with Residual Connections in Data Classification
Respati, Rayhan Dhafir; Soim, Sopian; Fadhli, Mohammad
Jurnal Teknik Informatika (JUTIF) Vol. 6 No. 4 (2025): JUTIF Volume 6, Number 4, August 2025
Publisher : Informatika, Universitas Jenderal Soedirman

DOI: 10.52436/1.jutif.2025.6.4.4899

Abstract

The growing complexity and volume of data across various domains necessitate machine learning models that are scalable and robust for large-scale classification tasks. Ensemble methods such as Gradient Boosting Decision Trees (GBDT) are effective, but they face scalability and training-stability issues when applied to very deep architectures. This work introduces residual connections, adapted from deep neural networks, into the Distributed Gradient Boosting Forest (DGBF) algorithm. By enabling direct gradient propagation across layers, residual connections mitigate the vanishing gradient problem, thereby improving gradient flow, accelerating convergence, and stabilising training. The Residual DGBF model was evaluated on seven distinct datasets spanning cybersecurity, financial fraud, phishing, and malware detection. It consistently surpassed the baseline DGBF in accuracy, precision, recall, and F1-score across all datasets, particularly those with imbalanced classes and complex feature interactions, indicating improved generalisation and higher predictive accuracy. Layer-wise gradient magnitude analysis shows more stable and stronger gradients throughout the depth of the model, supporting these improvements and confirming the effectiveness of residual connections in deep ensemble learning. This work advances ensemble techniques by combining the scalability and interpretability of decision tree ensembles with the optimisation benefits of residual architectures. The proposed Residual DGBF offers a robust and scalable method for challenging real-world classification tasks and opens avenues for future research on deep boosting frameworks.
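
To make the idea of a residual connection in a layered tree ensemble concrete, the following is a minimal, illustrative Python sketch using scikit-learn. It is not the paper's Residual DGBF implementation: the class name ResidualBoostingLayers, the layer and tree counts, the feature augmentation with the running score, and the binary log-loss setup are all assumptions made for illustration.

# Illustrative sketch: layered boosting with an identity (residual) skip between
# layers. Layer sizes, learning rate, feature augmentation, and the binary
# log-loss setup are assumptions, not the paper's exact Residual DGBF design.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


class ResidualBoostingLayers:
    def __init__(self, n_layers=5, trees_per_layer=10, lr=0.1, max_depth=3):
        self.n_layers = n_layers
        self.trees_per_layer = trees_per_layer
        self.lr = lr
        self.max_depth = max_depth
        self.layers = []  # one list of fitted trees per layer

    def _augment(self, X, F):
        # each layer sees the raw features plus the running score from below
        return np.hstack([X, F.reshape(-1, 1)])

    def fit(self, X, y):
        F = np.zeros(len(y))  # raw score (log-odds), carried by the skip path
        for _ in range(self.n_layers):
            Xa = self._augment(X, F)
            forest, correction = [], np.zeros(len(y))
            for _ in range(self.trees_per_layer):
                # negative gradient of binary log-loss at the current score
                grad = y - sigmoid(F + correction)
                tree = DecisionTreeRegressor(max_depth=self.max_depth)
                tree.fit(Xa, grad)
                correction += self.lr * tree.predict(Xa)
                forest.append(tree)
            # residual connection: layer output = identity(F) + correction
            F = F + correction
            self.layers.append(forest)
        return self

    def predict_proba(self, X):
        F = np.zeros(X.shape[0])
        for forest in self.layers:
            Xa = self._augment(X, F)
            correction = sum(self.lr * t.predict(Xa) for t in forest)
            F = F + correction  # the skip path carries F forward unchanged
        return sigmoid(F)


# Tiny usage example on synthetic data.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = ResidualBoostingLayers().fit(X_tr, y_tr)
print("test accuracy:", ((model.predict_proba(X_te) > 0.5) == y_te).mean())

The point the sketch tries to convey is that the running score F is passed forward unchanged alongside each layer's learned correction, so the loss gradient reaches every layer directly rather than only through the layers above it, which is the mechanism the abstract credits for improved gradient flow and training stability.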