Short-Term Load Forecasting (STLF) is a critical task in power system operations, enabling efficient energy management and planning. This study presents a comparative analysis of five machine learning models, namely XGBoost, Random Forest, Multi-Layer Perceptron (MLP), Support Vector Regression (SVR), and LightGBM, using real-world electricity demand data collected over a four-month period. Two modeling approaches are explored: one using only time-based features (hour, day of the week, month), and another that additionally incorporates historical lag features (lag_1, lag_2, lag_3) to capture temporal patterns. The results show that the MLP with lag features achieves the best performance (RMSE: 57.63, MAE: 34.54, MAPE: 0.22), highlighting its ability to model nonlinear and sequential dependencies. In contrast, SVR and LightGBM suffer performance degradation when lag features are added, suggesting sensitivity to feature dimensionality and data volume. These findings underscore the importance of aligning model choice with feature design and of temporal context in improving forecasting accuracy. Future work could explore the integration of external variables such as weather and holidays, as well as advanced deep learning architectures such as LSTMs or hybrid models, to further enhance robustness and generalizability.
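The two feature sets described above can be sketched as follows. This is a minimal illustration using synthetic hourly demand data (the study's real dataset is not reproduced here); the feature names (hour, dayofweek, month, lag_1 through lag_3) mirror those in the abstract, while the chronological 80/20 split and the MLP hyperparameters are illustrative assumptions, not the study's configuration.

```python
import numpy as np
import pandas as pd
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for roughly four months of hourly demand
# (the study's real data is not public, so values here are illustrative).
rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", periods=24 * 120, freq="h")
demand = 500 + 100 * np.sin(2 * np.pi * idx.hour / 24) + rng.normal(0, 20, len(idx))
df = pd.DataFrame({"demand": demand}, index=idx)

# Time-based features (first modeling approach)
df["hour"] = df.index.hour
df["dayofweek"] = df.index.dayofweek
df["month"] = df.index.month

# Historical lag features (second modeling approach)
for k in (1, 2, 3):
    df[f"lag_{k}"] = df["demand"].shift(k)
df = df.dropna()  # first rows lack lag values

features = ["hour", "dayofweek", "month", "lag_1", "lag_2", "lag_3"]
split = int(len(df) * 0.8)  # chronological split: no shuffling for time series
X_train, X_test = df[features][:split], df[features][split:]
y_train, y_test = df["demand"][:split], df["demand"][split:]

# Hyperparameters here are placeholders, not the study's tuned settings.
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)

rmse = mean_squared_error(y_test, pred) ** 0.5
mae = mean_absolute_error(y_test, pred)
print(f"RMSE={rmse:.2f}  MAE={mae:.2f}")
```

Dropping the three lag columns from `features` reproduces the time-only baseline, which is how the two approaches can be compared under identical splits and metrics.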
Copyright © 2025