Estimating the parameters of the normal distribution is one of the fundamental problems in statistics and has attracted the attention of researchers for more than two centuries. This study analyses and compares the performance of six estimation methods for the parameters of the normal distribution, namely the Method of Moments (MoM), Maximum Likelihood Estimation (MLE), Least Squares Estimation (LSE), Bayesian Estimation, Percentile Matching, and the Generalised Method of Moments (GMM), through a systematic and comprehensive approach. The methodology combines rigorous mathematical derivation of each estimation method with empirical evaluation through extensive Monte Carlo simulation. Each method was derived from first principles to its final estimator formula and then implemented in a simulation with 500 replications at sample sizes n = 30, 50, 100, and 200 drawn from the N(5, 4) distribution. For small samples (n = 30), Percentile Matching produced the highest MSE for parameter estimation, at 0.162, while MoM, MLE, and LSE performed best, with an MSE of 0.127 for the μ parameter and 0.067 for the σ parameter. The main conclusion of this study is that MLE provides an optimal balance between statistical accuracy and computational efficiency for most practical applications. Bayesian Estimation shows good stability across all sample sizes and is superior when informative prior information is available.
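
To make the simulation design concrete, the following is a minimal Python sketch, not the authors' implementation, of the Monte Carlo evaluation described above: the maximum-likelihood estimates of μ and σ are computed on 500 replicated samples from N(5, 4) for each sample size, and the squared errors are averaged to give the MSE. Variable names, the random seed, and the restriction to the MLE (which coincides with MoM for the normal mean) are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    mu_true, sigma_true = 5.0, 2.0      # N(5, 4): mean 5, variance 4, so sigma = 2
    n_reps = 500                        # Monte Carlo replications per sample size
    sample_sizes = [30, 50, 100, 200]

    for n in sample_sizes:
        sq_err_mu, sq_err_sigma = [], []
        for _ in range(n_reps):
            x = rng.normal(mu_true, sigma_true, size=n)
            # MLE of mu and sigma for a normal sample
            mu_hat = x.mean()
            sigma_hat = x.std(ddof=0)   # divisor n, the maximum-likelihood form
            sq_err_mu.append((mu_hat - mu_true) ** 2)
            sq_err_sigma.append((sigma_hat - sigma_true) ** 2)
        print(f"n={n:4d}  MSE(mu)={np.mean(sq_err_mu):.4f}  "
              f"MSE(sigma)={np.mean(sq_err_sigma):.4f}")

Under these assumptions the theoretical large-sample values, MSE(μ̂) ≈ σ²/n and MSE(σ̂) ≈ σ²/(2n), give roughly 0.133 and 0.067 at n = 30, consistent with the magnitudes reported above.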