This study proposes the Entropy-Regularized NARX (ER-NARX) model, which integrates nonlinear autoregressive modeling with exogenous inputs, entropy-based regularization, and information-theoretic learning for big data forecasting. The NARX component captures temporal dependencies between past outputs and exogenous inputs, while entropy regularization controls the uncertainty of model predictions and prevents overfitting. The model's innovation lies in its ability to control information flow through entropy regularization, which balances predictive accuracy against uncertainty and prevents the model from becoming overly deterministic. By combining these components, the ER-NARX model enhances the stability and robustness of its forecasts and improves generalization to complex, high-dimensional data. Additionally, fractional dynamics are employed to model long-range memory effects in temporal data, enhancing the model's ability to handle datasets with extended dependencies. The resulting ER-NARX framework provides a mathematically grounded approach to big data forecasting that improves performance in a computationally efficient manner. Future research may explore advanced entropy regularization techniques and apply the model to more diverse real-world data with intricate dependencies.
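The combination of NARX regression with an entropy penalty described above can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's implementation: the lag orders `p` and `q`, the weight `lam`, and the use of a Gaussian predictive entropy (computed from the residual variance) are all illustrative choices.

```python
import numpy as np

def narx_features(y, u, p=2, q=2):
    """Build a NARX design matrix: each row stacks the p lagged
    outputs y[t-p:t] and q lagged exogenous inputs u[t-q:t]."""
    t0 = max(p, q)
    rows = [np.concatenate([y[t - p:t], u[t - q:t]])
            for t in range(t0, len(y))]
    return np.array(rows), y[t0:]

def er_narx_loss(w, X, y_true, lam=0.1):
    """Squared-error loss minus an entropy term (illustrative form).
    The entropy of a Gaussian predictive distribution is
    0.5*log(2*pi*e*var); subtracting lam*entropy rewards retaining
    predictive uncertainty, discouraging overly deterministic fits."""
    y_hat = X @ w
    resid = y_true - y_hat
    var = resid.var() + 1e-8          # guard against log(0)
    mse = np.mean(resid ** 2)
    entropy = 0.5 * np.log(2 * np.pi * np.e * var)
    return mse - lam * entropy
```

In practice the regularized objective would be minimized directly over the model parameters; here, for brevity, the features could be fit by least squares and the entropy term evaluated at the solution to compare candidate settings of `lam`.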