Articles

Found 5 Documents

On the Comparison of PAR, DARMA, and INAR in Modeling Count Time Series Data
Buba, Haruna; Abdulkadir, Ahmed; Lasisi, Kazeem E.; Bishir, A.; Mashat, Strong Yusuf
Mikailalsys Journal of Mathematics and Statistics Vol 3 No 3 (2025): Mikailalsys Journal of Mathematics and Statistics
Publisher : Darul Yasin Al Sys

DOI: 10.58578/mjms.v3i3.6312

Abstract

This study evaluates the forecasting and fitting performance of three advanced models—Poisson Autoregressive (PAR), Discrete Autoregressive Moving Average (DARMA), and Integer-Valued Autoregressive (INAR)—for count time series data exhibiting complex features such as autocorrelation, overdispersion, and zero inflation. Both simulated and empirical datasets were analyzed, and model performance was assessed using Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), Root Mean Square Error (RMSE), and Mean Absolute Error (MAE). The results indicate that PAR models significantly outperform DARMA and INAR models, achieving substantially lower AIC (482.53 vs. >5,310,479) and RMSE (3,742 vs. 246,682), highlighting their robustness in handling periodic trends and autocorrelation. In contrast, standard Poisson regression performs poorly under overdispersion, with an AIC approaching 5.3 million, while zero-inflated datasets compromise error metrics such as Mean Absolute Percentage Error (MAPE) due to division by zero. Although DARMA and INAR models perform comparably, they are less effective in capturing extreme fluctuations or sudden spikes. These findings emphasize the limitations of conventional models and point to the need for more flexible approaches, such as hybrid ZIP-INAR models or Bayesian methods, to effectively manage overdispersion and zero inflation. The study concludes with a practical recommendation to prioritize PAR models when modeling autocorrelated count data.
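
To make the INAR mechanism referenced above concrete, here is a minimal simulation sketch of an INAR(1) process built from binomial thinning with Poisson innovations; the parameter values (alpha = 0.5, lam = 2.0) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_inar1(n, alpha, lam):
    """Simulate an INAR(1) series X_t = alpha ∘ X_{t-1} + eps_t,
    where ∘ denotes binomial thinning and eps_t ~ Poisson(lam)."""
    x = np.empty(n, dtype=int)
    x[0] = rng.poisson(lam / (1 - alpha))  # start near the stationary mean
    for t in range(1, n):
        survivors = rng.binomial(x[t - 1], alpha)  # binomial thinning of last count
        x[t] = survivors + rng.poisson(lam)        # add Poisson innovations
    return x

series = simulate_inar1(500, alpha=0.5, lam=2.0)
print(series[:10], series.mean())  # stationary mean is lam / (1 - alpha) = 4
```

Series generated this way stay integer-valued and autocorrelated, which is what distinguishes INAR-type models from fitting an ordinary ARMA model to count data.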
Advances in Bayesian Approaches for Stochastic Process Modeling and Uncertainty Quantification
Weng Nyam, Peter; Bishir, A.; Mukhtar, Ummi; Gali, Abubakar Muhammad; Moses, Nyango Yusuf
Mikailalsys Journal of Mathematics and Statistics Vol 3 No 3 (2025): Mikailalsys Journal of Mathematics and Statistics
Publisher : Darul Yasin Al Sys

DOI: 10.58578/mjms.v3i3.7429

Abstract

Stochastic processes serve as foundational models for systems characterized by random evolution across time or space, making them essential tools in disciplines such as finance, physics, epidemiology, and environmental science. Traditional statistical methods often yield only point estimates of model parameters, limiting their capacity to capture the full scope of uncertainty inherent in such systems. In contrast, Bayesian inference offers a rigorous and comprehensive probabilistic framework by treating both parameters and stochastic processes as random variables. This approach enables the integration of prior knowledge and yields posterior distributions that encapsulate uncertainty more fully. This paper presents a comprehensive survey of Bayesian inference as applied to stochastic processes. It begins by outlining the theoretical foundations of Bayes' Theorem in this context, emphasizing the importance of prior specification for infinite-dimensional function spaces. The discussion then turns to key classes of stochastic processes—including Gaussian Processes, Markov Models, and State-Space Models—highlighting how Bayesian methods enhance their interpretability and predictive capacity. Given the complexity of posterior distributions in these models, the paper also reviews modern computational techniques such as Markov Chain Monte Carlo (MCMC) and Variational Inference (VI) that enable practical implementation. Applications across multiple domains are explored to demonstrate the flexibility and power of the Bayesian approach. The study concludes by identifying emerging challenges and outlining promising directions for future research in Bayesian inference for stochastic systems.
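
Of the computational techniques the survey covers, MCMC is the most widely used. Below is a minimal random-walk Metropolis-Hastings sketch for a conjugate Poisson-Gamma model, chosen so the sampler's output can be checked against the closed-form posterior; the data, prior, and step size are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: Poisson counts with a Gamma(a, b) prior on the rate
data = rng.poisson(3.0, size=50)
a, b = 2.0, 1.0  # Gamma shape and rate

def log_posterior(lam):
    if lam <= 0:
        return -np.inf
    # Gamma(a, b) log-prior plus Poisson log-likelihood (up to constants)
    return (a - 1) * np.log(lam) - b * lam + data.sum() * np.log(lam) - len(data) * lam

# Random-walk Metropolis-Hastings
samples, lam = [], 1.0
for _ in range(20_000):
    proposal = lam + rng.normal(0, 0.3)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(lam):
        lam = proposal  # accept the move
    samples.append(lam)

posterior = np.array(samples[5_000:])  # discard burn-in
# Conjugacy gives the exact posterior Gamma(a + sum(data), b + n) as a check
print(posterior.mean(), (a + data.sum()) / (b + len(data)))
```

The two printed means should agree closely; in the non-conjugate models the survey discusses (Gaussian Processes, state-space models), no such closed form exists, which is precisely why MCMC and VI are needed in practice.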
Methods and Applications of Point Estimation in Inferential Statistics: A Case Study of Energy Consumption Data at ATBU
Yakubu, J.; Bishir, A.; Jibril, J.; Adamu, A.; James, K. Y.; Ibrahim, A. I.
Mikailalsys Journal of Mathematics and Statistics Vol 3 No 3 (2025): Mikailalsys Journal of Mathematics and Statistics
Publisher : Darul Yasin Al Sys

DOI: 10.58578/mjms.v3i3.7467

Abstract

This study employs established point estimation techniques in inferential statistics—including Ordinary Least Squares (OLS), Maximum Likelihood Estimation (MLE), Ridge regression, and Lasso regression—to analyze a 30-month dataset on energy consumption, billing, and revenue collection from Abubakar Tafawa Balewa University (ATBU), Bauchi. The primary objective is to assess the accuracy and efficiency of parameter estimation methods for predicting revenue based on energy billed. Using regression-based models, the study evaluates performance across two sites: the Main Campus and the Permanent Site. Empirical findings demonstrate strong model explanatory power, with R² values of approximately 0.90 and 0.80, respectively, indicating a high degree of reliability in the predictive capacity of the models. OLS is shown to provide unbiased estimates, while regularization techniques such as Ridge and Lasso improve model robustness by addressing multicollinearity and overfitting. The results highlight the practical applicability of statistical modeling in energy revenue forecasting and offer valuable insights for institutional energy management. The study concludes by recommending the integration of regularized regression techniques for more resilient forecasting frameworks in similar energy consumption environments.
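
As a rough illustration of the estimators compared in this study, the sketch below fits OLS, Ridge, and Lasso to a synthetic billed-energy/revenue relationship; since the ATBU dataset is not reproduced in the abstract, all numbers here are assumed for illustration only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso

rng = np.random.default_rng(1)

# Synthetic stand-in for 30 monthly records of energy billed vs. revenue
energy_billed = rng.uniform(100, 1000, size=(30, 1))
revenue = 0.8 * energy_billed[:, 0] + rng.normal(0, 40, size=30)

for name, model in [("OLS", LinearRegression()),
                    ("Ridge", Ridge(alpha=1.0)),
                    ("Lasso", Lasso(alpha=1.0))]:
    model.fit(energy_billed, revenue)
    print(f"{name}: slope={model.coef_[0]:.3f}, "
          f"R^2={model.score(energy_billed, revenue):.3f}")
```

With a single predictor the three estimators behave similarly; the regularized variants earn their keep when several correlated predictors (e.g., consumption, billing, and collection figures) enter the model, which is the multicollinearity scenario the abstract points to.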
Applications of the Bayesian Methods in Clinical Trials with Large Sample Size
Amani, D. J.; Bishir, A.; Usman, M. A.; Amos, S.; Yelwa, A.; Nyam, Peter Weng
Mikailalsys Journal of Mathematics and Statistics Vol 4 No 1 (2026): Mikailalsys Journal of Mathematics and Statistics
Publisher : Darul Yasin Al Sys

DOI: 10.58578/mjms.v4i1.7483

Abstract

Bayesian methods have gained prominence as robust alternatives to traditional frequentist approaches in the design and analysis of clinical trials, particularly those involving large sample sizes. While frequentist methods rely on fixed hypotheses and long-run probability interpretations, Bayesian frameworks incorporate prior knowledge and allow for iterative updating of evidence as data accrue. This adaptability facilitates the implementation of innovative trial structures such as adaptive designs and platform trials, while also supporting real-time decision-making. The integration of historical or external data within Bayesian analyses further enhances trial efficiency, especially in interim monitoring and interpretation of treatment effects. Despite these advantages, the broader adoption of Bayesian methods in confirmatory Phase III trials remains constrained by computational demands, challenges in the elicitation and justification of prior distributions, and varying degrees of regulatory acceptance. Nevertheless, advancements in high-performance computing, the emergence of hybrid Bayesian–frequentist methodologies, and growing regulatory engagement underscore a progressive shift toward broader implementation. This paper critically examines the evolution, methodological underpinnings, and practical applications of Bayesian approaches in large-sample clinical trials, offering a comparative assessment with frequentist methods. It also outlines key benefits, prevailing limitations, and potential trajectories for future research and regulatory alignment. These insights contribute to ongoing discourse on optimizing trial design for enhanced scientific rigor, ethical standards, and decision-making in evidence-based medicine.
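
A small conjugate example shows the kind of iterative updating the abstract describes: a Beta-Binomial posterior probability that a treatment response rate exceeds control, the sort of quantity an adaptive interim rule can monitor. The counts and flat priors below are illustrative assumptions, not trial data from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative interim counts: responders / enrolled per arm
resp_t, n_t = 45, 100  # treatment
resp_c, n_c = 30, 100  # control

# Beta(1, 1) priors updated by conjugacy: posterior is Beta(1 + r, 1 + n - r)
p_t = rng.beta(1 + resp_t, 1 + n_t - resp_t, size=100_000)
p_c = rng.beta(1 + resp_c, 1 + n_c - resp_c, size=100_000)

# Posterior probability that treatment beats control; an interim monitoring
# rule can stop or adapt the trial once this crosses a pre-agreed threshold
print("P(p_t > p_c | data) =", (p_t > p_c).mean())
```

Because the posterior is recomputed as data accrue, the same quantity supports the real-time decision-making and historical-data borrowing the abstract highlights.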
A Bayesian Decision-Theoretic Framework for Optimally Managing Asymmetric Error Costs in Hypothesis Testing
Daniel, John Abisi A.; Bishir, A.; Ibrahim, Abdulhalim Isah; Zabi, Zainab Muhammad; Gabchiya, Abubakar; Nyam, Peter Weng
Asian Journal of Science, Technology, Engineering, and Art Vol 3 No 6 (2025): Asian Journal of Science, Technology, Engineering, and Art
Publisher : Darul Yasin Al Sys

DOI: 10.58578/ajstea.v3i6.7714

Abstract

The classical Neyman–Pearson paradigm of hypothesis testing mandates control of the Type I error rate (α) while maximizing power (1 − β), but this foundational approach has been widely criticized for its rigidity, reliance on arbitrary significance thresholds, and inability to formally incorporate the relative costs of different errors. This paper presents a Bayesian decision-theoretic framework as a principled alternative for optimizing the trade-off between Type I and Type II errors. By combining prior information with observed data to form a posterior distribution and minimizing a loss function that explicitly quantifies the consequences of decisions, the optimal decision rule emerges naturally and balances posterior evidence against asymmetric error costs. A detailed case study in medical diagnostics illustrates the practical advantages of this approach, demonstrating how optimal decisions change when the severity of errors is explicitly taken into account. The paper argues that the Bayesian framework provides a more coherent, flexible, and context-sensitive methodology for statistical decision-making, moving beyond the limitations imposed by a fixed α.
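
The decision rule described above can be stated compactly: with cost c1 for a Type I error and c2 for a Type II error, minimizing posterior expected loss means rejecting H0 whenever P(H1 | x) > c1 / (c1 + c2), since rejecting incurs expected loss c1 * P(H0 | x) while accepting incurs c2 * P(H1 | x). The sketch below encodes that rule; the 10:1 cost ratio is an illustrative assumption, not a figure from the paper's diagnostic case study.

```python
# Asymmetric error costs (illustrative: a missed disease is taken to be
# 10x worse than a false alarm)
cost_type1, cost_type2 = 1.0, 10.0

def reject_h0(posterior_h1):
    """Reject H0 when the expected loss of rejecting, cost_type1 * P(H0|x),
    is below the expected loss of accepting, cost_type2 * P(H1|x)."""
    threshold = cost_type1 / (cost_type1 + cost_type2)  # = 1/11 here
    return posterior_h1 > threshold

# A posterior of 0.20 would fail a symmetric 0.5 rule, yet the asymmetric
# costs make acting on it the lower-loss decision
print(reject_h0(0.20))  # True
```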