Multicollinearity in feature-based time series regression arises as a structural consequence of lagged and rolling feature construction. Existing studies of Ridge and ElasticNet regularization, however, adopt an accuracy-driven evaluation paradigm and pay limited attention to parameter stability, shrinkage behavior, and sensitivity to regularization strength. This study shifts the evaluation of regularized linear models from predictive accuracy toward stability-oriented assessment. Using daily electricity consumption data from the UCI Repository, Linear Regression, Ridge, and ElasticNet models are examined with temporal features engineered through stability-based lag pruning, rolling statistics, and correlation-informed feature selection. Evaluation focuses on bias–variance behavior, coefficient shrinkage, regularization sensitivity, and the training–testing performance gap. The results show that regularization improves stability: the performance gap decreases from 0.0961 under Linear Regression to 0.0608 under ElasticNet. Ridge exhibits conservative shrinkage averaging 6.06%, whereas ElasticNet induces stronger shrinkage averaging 46.32% and is more sensitive to penalty strength, indicating that the two penalties stabilize regression models through distinct shrinkage mechanisms and can inform model selection beyond accuracy. These findings provide methodological evidence that regularization in feature-based time series regression should be treated as a stability strategy rather than an accuracy optimization tool, offering guidance for electricity load forecasting under structurally redundant temporal features.
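The comparison described above can be sketched in a few lines. This is a minimal illustration, not the paper's code: the synthetic daily series stands in for the UCI electricity data, and the lag depth, penalty strengths, and `l1_ratio` are assumed values chosen only to show how the train–test gap and average coefficient shrinkage relative to OLS would be measured.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, ElasticNet
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Synthetic stand-in for a daily load series: trend + weekly cycle + noise.
n = 730
t = np.arange(n)
y = 100 + 0.05 * t + 10 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 2, n)

# Lags 1..7 as features: adjacent lags are highly correlated, which is the
# structural multicollinearity the abstract refers to.
lags = 7
X = np.column_stack([y[lags - k : n - k] for k in range(1, lags + 1)])
target = y[lags:]

# Chronological split (no shuffling for time series), scaler fit on train only.
split = int(0.8 * len(target))
scaler = StandardScaler().fit(X[:split])
X_tr, X_te = scaler.transform(X[:split]), scaler.transform(X[split:])
y_tr, y_te = target[:split], target[split:]

models = {
    "OLS": LinearRegression(),
    "Ridge": Ridge(alpha=10.0),  # assumed penalty strength
    "ElasticNet": ElasticNet(alpha=0.1, l1_ratio=0.5, max_iter=50_000),
}

results = {}
ols_norm = None
for name, model in models.items():
    model.fit(X_tr, y_tr)
    # Stability view: gap between train and test R^2, not test R^2 alone.
    gap = r2_score(y_tr, model.predict(X_tr)) - r2_score(y_te, model.predict(X_te))
    coef_norm = np.abs(model.coef_).sum()
    if ols_norm is None:  # OLS is fitted first and serves as the baseline
        ols_norm = coef_norm
    shrink_pct = 100 * (1 - coef_norm / ols_norm)
    results[name] = {"gap": gap, "shrink_pct": shrink_pct}
    print(f"{name:>10}: train-test R^2 gap = {gap:+.4f}, shrinkage = {shrink_pct:.2f}%")
```

On data with redundant lags, the ElasticNet L1 component typically zeroes some coefficients outright, while Ridge only dampens them, mirroring the conservative-versus-aggressive shrinkage contrast reported above.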
Copyright © 2026