This paper presents a new algorithm for building neural network models that automatically selects the most important input features and parameters while improving prediction accuracy. Traditional neural networks often use all available input parameters, leading to complex models that are slow to train and prone to overfitting. The proposed algorithm addresses this challenge by automatically identifying and retaining only the most significant parameters during training, resulting in simpler, faster, and more accurate models. We demonstrate the practical benefits of the proposed algorithm through two real-world applications: stock market forecasting using the Wilshire index and business profitability prediction based on company financial data. The results show significant improvements over conventional methods: the models use fewer parameters (yielding simpler, more interpretable solutions), achieve better prediction accuracy, and require less training time. These advantages make the algorithm particularly valuable for business applications where model simplicity, speed, and accuracy are crucial. The method is especially beneficial for organizations with limited computational resources or a need for rapid model deployment. By automatically selecting the most relevant features, it reduces the need for manual feature engineering and helps practitioners build more efficient predictive models without requiring deep technical expertise in neural network optimization.
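The abstract does not specify the algorithm's details, so the following is only a minimal sketch of the general idea it describes: dropping insignificant parameters during training. It uses a generic magnitude-based input-pruning rule on a small numpy network; the pruning criterion, thresholds, and names such as PRUNE_EVERY and PRUNE_THRESHOLD are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: only the first 3 of 10 inputs are informative.
X = rng.normal(size=(500, 10))
y = X[:, 0] - 2 * X[:, 1] + 0.5 * X[:, 2] + 0.1 * rng.normal(size=500)

n_in, n_hid = X.shape[1], 8
W1 = rng.normal(scale=0.3, size=(n_in, n_hid))   # input-to-hidden weights
b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.3, size=n_hid)           # hidden-to-output weights
b2 = 0.0

LR, EPOCHS = 0.01, 300
PRUNE_EVERY, PRUNE_THRESHOLD = 50, 0.2           # hypothetical pruning schedule
active = np.ones(n_in, dtype=bool)               # mask of retained input features

for epoch in range(1, EPOCHS + 1):
    # Forward pass and squared-error gradients for a one-hidden-layer network.
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    gW2 = h.T @ err / len(y)
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)
    gW1 = X.T @ dh / len(y)
    gb1 = dh.mean(axis=0)

    W1 -= LR * gW1; b1 -= LR * gb1
    W2 -= LR * gW2; b2 -= LR * gb2
    W1[~active, :] = 0.0                         # keep pruned inputs disconnected

    # Periodically rank inputs by the total magnitude of their outgoing weights
    # and drop those whose importance falls below a fraction of the maximum.
    if epoch % PRUNE_EVERY == 0:
        importance = np.abs(W1).sum(axis=1)
        active &= importance >= PRUNE_THRESHOLD * importance.max()
        W1[~active, :] = 0.0

print("retained input features:", np.flatnonzero(active))
print("final MSE:", float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))
```

Under these assumptions, features whose connections stay weak throughout training are removed, leaving a smaller model; the actual selection criterion and schedule used in the paper may differ.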