Found 2 Documents

NUMERICAL AND CONVERGENCE ANALYSIS OF AN ENHANCED DAI-LIAO METHOD FOR UNCONSTRAINED OPTIMIZATION
Hassan, Basim A.; Sulaiman, Ibrahim Mohammed; Subhi, Yeldez J.
BAREKENG: Jurnal Ilmu Matematika dan Terapan Vol 19 No 4 (2025): BAREKENG: Journal of Mathematics and Its Application
Publisher: PATTIMURA UNIVERSITY

DOI: 10.30598/barekengvol19iss4pp2993-3004

Abstract

Iterative algorithms play an important role in mathematical optimization, particularly in solving large-scale unconstrained optimization problems. Conjugate gradient (CG) methods are widely used because of their low memory requirements and efficiency, but their performance depends heavily on the choice of the parameter that shapes the search direction and governs the convergence speed. Traditional CG algorithms can suffer from slow convergence or poor accuracy, especially on ill-conditioned problems, so improved parameter choices are needed to enhance solution accuracy and efficiency. This study constructs a new conjugate gradient parameter from the curvature condition to refine the search directions and accelerate convergence. The proposed approach strikes a more effective balance between the descent property and numerical stability. Preliminary numerical experiments demonstrate that the proposed method outperforms classical CG variants in convergence rate and accuracy: the improved search directions yield faster and more reliable solutions. The newly developed conjugate gradient formula thus contributes a more robust and efficient optimization tool and broadens the applicability of CG methods to complex optimization problems.
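The abstract does not reproduce the new parameter formula, so the sketch below only illustrates the classical Dai-Liao framework the paper builds on: the search direction d_{k+1} = -g_{k+1} + beta_k d_k with the Dai-Liao choice beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k), where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. The Armijo backtracking search, the fixed scalar t, and the restart safeguard are illustrative assumptions, not the paper's method.

```python
import numpy as np

def armijo_line_search(f, x, d, g, alpha0=1.0, rho=0.5, c1=1e-4, max_iter=50):
    """Backtracking (Armijo) line search; a simple stand-in for the
    Wolfe-type search such papers typically assume."""
    alpha, fx, slope = alpha0, f(x), c1 * g.dot(d)
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + alpha * slope:
            return alpha
        alpha *= rho
    return alpha

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=1000):
    """Generic Dai-Liao CG loop. beta follows the classical Dai-Liao
    formula; the paper's curvature-based parameter is NOT reproduced here."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo_line_search(f, x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s = x_new - x              # step vector s_k
        y = g_new - g              # gradient change y_k
        denom = d.dot(y)
        if abs(denom) < 1e-12:     # safeguard: restart on breakdown
            beta = 0.0
        else:
            beta = (g_new.dot(y) - t * g_new.dot(s)) / denom  # Dai-Liao beta
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:      # enforce a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from a standard starting point.
if __name__ == "__main__":
    f = lambda x: 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2
    grad = lambda x: np.array([-400 * x[0] * (x[1] - x[0]**2) - 2 * (1 - x[0]),
                               200 * (x[1] - x[0]**2)])
    print(dai_liao_cg(f, grad, np.array([-1.2, 1.0])))
```

The curvature condition mentioned in the abstract is, in the standard Wolfe sense, the requirement s_k^T y_k > 0, which keeps the denominator d_k^T y_k positive; how the paper exploits it to build its parameter is not stated in the abstract.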
NEW CONJUGATE GRADIENT METHOD FOR ACCELERATED CONVERGENCE AND COMPUTATIONAL EFFICIENCY IN UNCONSTRAINED OPTIMIZATION PROBLEMS
Hassan, Basim A.; Ibrahim, Alaa Luqman; Ameen, Thaair A.; Sulaiman, Ibrahim Mohammed
BAREKENG: Jurnal Ilmu Matematika dan Terapan Vol 20 No 1 (2026): BAREKENG: Journal of Mathematics and Its Application
Publisher: PATTIMURA UNIVERSITY

DOI: 10.30598/barekengvol20iss1pp0481-0492

Abstract

Conjugate gradient (CG) algorithms play an important role in solving large-scale unconstrained optimization problems due to their low memory requirements and strong convergence properties. However, many classical CG algorithms suffer from inefficiencies when dealing with complex or ill-conditioned objective functions. This paper addresses this challenge by proposing a new conjugate gradient method that combines the descent direction of traditional CG algorithms with Newton-type updates to improve convergence and computational efficiency. The proposed method is constructed to ensure sufficient descent at each iteration and global convergence under standard assumptions. By integrating the modified Newton update mechanism, the method effectively accelerates convergence without incurring the high computational cost typically associated with full Newton methods. To evaluate the performance of the proposed approach, we conducted extensive numerical experiments using a collection of well-known benchmark functions from the CUTEr test suite. The results show that the new method consistently outperforms the classical Hestenes-Stiefel method in terms of CPU time, number of function evaluations, and iteration count. These findings confirm the method’s potential as an efficient and robust alternative for solving large-scale unconstrained optimization problems.
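For reference, the Hestenes-Stiefel baseline named above uses beta_k^{HS} = g_{k+1}^T y_k / (d_k^T y_k), with y_k = g_{k+1} - g_k. A minimal sketch of that classical parameter follows; the paper's Newton-type update is not given in the abstract and is not reproduced here.

```python
import numpy as np

def beta_hs(g_new: np.ndarray, y: np.ndarray, d: np.ndarray,
            eps: float = 1e-12) -> float:
    """Classical Hestenes-Stiefel parameter:
    beta_k = g_{k+1}^T y_k / (d_k^T y_k).
    Returns 0 (a steepest-descent restart) when the denominator degenerates."""
    denom = d.dot(y)
    return float(g_new.dot(y) / denom) if abs(denom) > eps else 0.0
```

Substituting beta_hs for the Dai-Liao beta in the loop sketched earlier recovers the classical Hestenes-Stiefel method used as the comparison baseline in these experiments.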