Indonesian Journal of Electrical Engineering and Computer Science
Vol 22, No 3: June 2021

Two-versions of descent conjugate gradient methods for large-scale unconstrained optimization

Hawraz N. Jabbar (University of Kirkuk)
Basim A. Hassan (University of Mosul)



Article Info

Publish Date
01 Jun 2021

Abstract

Conjugate gradient methods are well known to be highly effective for solving large-scale unconstrained optimization problems, since they do not require the storage of matrices. Research on conjugate gradient methods focuses mainly on the choice of the conjugate parameter. This paper proposes new conjugate gradient parameters for solving large-scale unconstrained optimization problems. The parameters are derived from a diagonal Hessian approximation based on second- and third-order Taylor series expansions. The sufficient descent property of the proposed algorithms is proved, and their global convergence is established. In numerical experiments, the new algorithms are shown to be competitive with the Fletcher-Reeves (FR) algorithm.
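To illustrate the class of methods the abstract refers to, the following is a minimal sketch of a nonlinear conjugate gradient loop using the classical Fletcher-Reeves (FR) parameter named above. The paper's proposed conjugate parameters, built from the diagonal Hessian approximation, are not reproduced here since their formulas are not given in this abstract; the `beta` line is where they would be substituted. The function names, line-search constants, and test problem are illustrative assumptions, not part of the paper.

```python
# Sketch of a nonlinear conjugate gradient method with the
# Fletcher-Reeves (FR) parameter and an Armijo backtracking line search.
# The paper's proposed parameters would replace the `beta` formula below.
import numpy as np

def conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if d.dot(g) >= 0:               # safeguard: restart if not a descent direction
            d = -g
        # Armijo backtracking line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves conjugate parameter: ||g_{k+1}||^2 / ||g_k||^2
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d           # new search direction
        x, g = x_new, g_new
    return x

# Illustrative use on the two-dimensional Rosenbrock test function
if __name__ == "__main__":
    f = lambda x: 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2
    grad = lambda x: np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])
    print(conjugate_gradient(f, grad, np.array([-1.2, 1.0])))
```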

Copyright © 2021