NEW CONJUGATE GRADIENT METHOD FOR ACCELERATED CONVERGENCE AND COMPUTATIONAL EFFICIENCY IN UNCONSTRAINED OPTIMIZATION PROBLEMS
Hassan, Basim A.; Ibrahim, Alaa Luqman; Ameen, Thaair A.; Sulaiman, Ibrahim Mohammed
BAREKENG: Jurnal Ilmu Matematika dan Terapan Vol 20 No 1 (2026): BAREKENG: Journal of Mathematics and Its Application
Publisher : PATTIMURA UNIVERSITY

DOI: 10.30598/barekengvol20iss1pp0481-0492

Abstract

Conjugate gradient (CG) algorithms play an important role in solving large-scale unconstrained optimization problems due to their low memory requirements and strong convergence properties. However, many classical CG algorithms become inefficient on complex or ill-conditioned objective functions. To address this challenge, this paper proposes a new conjugate gradient method that combines the descent direction of traditional CG algorithms with Newton-type updates to improve convergence and computational efficiency. The proposed method is constructed to ensure sufficient descent at each iteration and global convergence under standard assumptions. By integrating the modified Newton update mechanism, the method effectively accelerates convergence without incurring the high computational cost typically associated with full Newton methods. To evaluate the performance of the proposed approach, we conducted extensive numerical experiments on a collection of well-known benchmark functions from the CUTEr test suite. The results show that the new method consistently outperforms the classical Hestenes-Stiefel method in terms of CPU time, number of function evaluations, and iteration count. These findings confirm the method’s potential as an efficient and robust alternative for solving large-scale unconstrained optimization problems.
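Since the abstract describes the method only at a high level, a minimal sketch of the classical Hestenes-Stiefel baseline it is compared against may help fix ideas. The objective function, gradient, Armijo line search, tolerance, and test problem below are all illustrative assumptions; the paper's modified Newton-type update for the CG parameter is not reproduced here.

```python
# A minimal sketch of a classical Hestenes-Stiefel conjugate gradient loop.
# f, grad_f, the backtracking line search, and the quadratic test problem
# are illustrative; the paper's proposed beta formula is not shown here.
import numpy as np

def hestenes_stiefel_cg(f, grad_f, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad_f(x)
    d = -g                               # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search -- a simple stand-in for the
        # Wolfe line search typically assumed in CG convergence proofs.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad_f(x_new)
        y = g_new - g                    # gradient difference y_k
        beta = (g_new @ y) / (d @ y)     # Hestenes-Stiefel parameter
        d = -g_new + beta * d            # new conjugate direction
        x, g = x_new, g_new
    return x

# Example usage on a simple (assumed) quadratic test function:
f = lambda x: 0.5 * (x @ x)
grad_f = lambda x: x
x_star = hestenes_stiefel_cg(f, grad_f, np.array([3.0, -2.0]))
```

Performance comparisons of the kind reported in the paper (CPU time, function evaluations, iteration count) would be measured against a loop of this form, with the proposed method differing in how the search direction is updated.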