Iterative algorithms play an important role in mathematical optimization, particularly in solving large-scale unconstrained optimization problems. Conjugate gradient (CG) methods are widely used because of their low memory requirements and efficiency, but their performance depends heavily on the choice of the conjugate gradient parameter, which determines the search directions and hence the convergence speed. Despite their advantages, classical CG algorithms sometimes suffer from slow convergence or poor accuracy, especially on ill-conditioned problems, so improved parameter-selection strategies are needed to enhance solution accuracy and efficiency. This study constructs a new conjugate gradient parameter based on the curvature condition to refine search directions and accelerate convergence. The proposed approach achieves a more effective balance between the descent property and numerical stability. Preliminary numerical experiments demonstrate that the proposed method outperforms classical CG variants in convergence rate and accuracy, and the improved search directions lead to faster and more reliable solutions. The newly developed conjugate gradient formula thus contributes to more robust and efficient optimization and broadens the applicability of CG methods to complex optimization problems.
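The generic nonlinear CG iteration the abstract builds on can be sketched as follows. Since the paper's new parameter formula is not stated here, this sketch substitutes the classical Hestenes-Stiefel parameter, which likewise arises from a curvature-type (conjugacy) condition involving the gradient difference y_{k-1} = g_k - g_{k-1}; the function names, line-search constants, and test problem are illustrative assumptions, not the authors' method.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear CG sketch with a Hestenes-Stiefel-type parameter.

    The HS parameter follows from the conjugacy condition
    d_k^T y_{k-1} = 0, where y_{k-1} = g_k - g_{k-1} carries
    curvature information about the objective.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:  # safeguard: restart if d is not a descent direction
            d = -g
        # Backtracking (Armijo) line search along d.
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g  # gradient difference (curvature proxy)
        denom = d @ y
        # HS parameter, clipped at zero so the method can restart with -g.
        beta = max(0.0, (g_new @ y) / denom) if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a small strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = nonlinear_cg(f, grad, np.zeros(2))
```

On this quadratic the iteration drives the gradient norm below the tolerance, recovering the solution of A x = b; any improved parameter of the kind the abstract describes would slot into the `beta` computation above.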
Copyright © 2025