This study investigates the effectiveness of dropout layers in reducing overfitting in Long Short-Term Memory (LSTM) neural networks for Sony stock price prediction. Financial time series forecasting presents significant challenges due to market volatility and noise, which often produce models that overfit historical data and fail to generalize to unseen market conditions. We implemented two LSTM models: one without dropout layers and one with dropout layers (rate = 0.2) applied after each LSTM layer. Using historical Sony stock data from 2015 to 2025, we evaluated both models with Root Mean Square Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE). The dropout model achieved superior performance on the test set, with an RMSE of 0.5971, an MAE of 0.4411, and a MAPE of 2.1502%, compared with the model without dropout, which obtained an RMSE of 0.7124, an MAE of 0.5636, and a MAPE of 2.6684%. The dropout model also exhibited markedly less overfitting, with smaller gaps between training and test performance across all metrics; in MAPE, the gap was nearly zero (0.0509%). This research provides empirical evidence that dropout regularization effectively improves LSTM generalization for stock prediction, offering practical value for developing more reliable financial forecasting models. Future research could explore optimal dropout rates for different market conditions and investigate combining dropout with other regularization techniques.
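As a minimal sketch of the architecture described above, the following code builds the two compared variants in Keras. Only the dropout rate (0.2, applied after each LSTM layer) comes from the study; the layer count, unit sizes, input window length, optimizer, and loss are illustrative assumptions.

```python
# Sketch of the two LSTM variants compared in the study.
# Assumptions (not from the paper): 2 stacked LSTM layers with 50 units each,
# a 60-day window of closing prices, Adam optimizer, MSE loss.
# Only the dropout rate (0.2 after each LSTM layer) is taken from the abstract.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense

def build_lstm(window=60, n_features=1, use_dropout=False, rate=0.2):
    model = Sequential()
    # First LSTM layer returns sequences so a second LSTM can be stacked on top.
    model.add(LSTM(50, return_sequences=True, input_shape=(window, n_features)))
    if use_dropout:
        model.add(Dropout(rate))  # dropout after each LSTM layer, as in the study
    model.add(LSTM(50))
    if use_dropout:
        model.add(Dropout(rate))
    model.add(Dense(1))  # predict the next closing price
    model.compile(optimizer="adam", loss="mse")
    return model

baseline = build_lstm(use_dropout=False)     # model without dropout
regularized = build_lstm(use_dropout=True)   # model with dropout (rate = 0.2)
```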
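For reference, the reported metrics follow their standard definitions (assuming the paper uses the conventional forms), with $y_t$ the actual price, $\hat{y}_t$ the predicted price, and $n$ the number of evaluation points:

$$
\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{t=1}^{n}\left(y_t - \hat{y}_t\right)^2}, \qquad
\mathrm{MAE} = \frac{1}{n}\sum_{t=1}^{n}\left\lvert y_t - \hat{y}_t \right\rvert, \qquad
\mathrm{MAPE} = \frac{100\%}{n}\sum_{t=1}^{n}\left\lvert \frac{y_t - \hat{y}_t}{y_t} \right\rvert .
$$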