Gerasimov, Roman
Unknown Affiliation

Published: 1 Document
Articles

Found 1 Document

An algorithm for training neural networks with L1 regularization
Gribanova, Ekaterina; Gerasimov, Roman
IAES International Journal of Artificial Intelligence (IJ-AI) Vol 14, No 5: October 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijai.v14.i5.pp3781-3789

Abstract

This paper presents a new algorithm for building neural network models that automatically selects the most important features and parameters while improving prediction accuracy. Traditional neural networks often use all available input parameters, leading to complex models that are slow to train and prone to overfitting. The proposed algorithm addresses this challenge by automatically identifying and retaining only the most significant parameters during training, resulting in simpler, faster, and more accurate models. We demonstrate the practical benefits of the proposed algorithm through two real-world applications: stock market forecasting using the Wilshire index and business profitability prediction based on company financial data. The results show significant improvements over conventional methods: the models use fewer parameters, yielding simpler and more interpretable solutions, achieve better prediction accuracy, and require less training time. These advantages make the algorithm particularly valuable for business applications where model simplicity, speed, and accuracy are crucial. The method is especially beneficial for organizations with limited computational resources or a need for fast model deployment. By automatically selecting the most relevant features, it reduces the need for manual feature engineering and helps practitioners build more efficient predictive models without requiring deep technical expertise in neural network optimization.
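To make the general idea concrete, the sketch below shows a standard way to combine neural network training with L1 regularization, which penalizes the absolute magnitude of weights so that unimportant ones are driven toward zero and can then be pruned. This is not the authors' specific algorithm (the paper's training procedure is not reproduced here); it is a minimal illustration assuming PyTorch, a mean-squared-error objective, and hypothetical names such as train_with_l1, prune_small_weights, and the l1_lambda and threshold values.

import torch
import torch.nn as nn

def train_with_l1(model, loader, epochs=50, lr=1e-3, l1_lambda=1e-3):
    """Train a model with an added L1 penalty on all parameters (illustrative)."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    mse = nn.MSELoss()
    for _ in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            loss = mse(model(x), y)
            # L1 penalty encourages sparsity: weights of unimportant inputs shrink toward zero.
            l1_penalty = sum(p.abs().sum() for p in model.parameters())
            (loss + l1_lambda * l1_penalty).backward()
            optimizer.step()
    return model

def prune_small_weights(model, threshold=1e-3):
    """Zero out weights whose magnitude fell below the threshold,
    effectively discarding the least significant parameters/features."""
    with torch.no_grad():
        for p in model.parameters():
            p[p.abs() < threshold] = 0.0
    return model

# Example usage with a toy regression set-up (shapes and data are illustrative).
model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 1))
x, y = torch.randn(256, 10), torch.randn(256, 1)
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(x, y), batch_size=32)
model = prune_small_weights(train_with_l1(model, loader))

In this generic setting, the strength of the L1 penalty (l1_lambda) controls the trade-off between sparsity and fit: larger values remove more parameters but may hurt accuracy if pushed too far.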