Indonesian Journal of Electrical Engineering and Computer Science
Vol 33, No 1: January 2024

A modified type of Fletcher-Reeves conjugate gradient method with its global convergence

Amna Weis Mohammed Ahmad Idress (Faculty of Mathematical and Computer Sciences, University of Gezira)
Osman Omer Osman Yousif (Faculty of Mathematical and Computer Sciences, University of Gezira)
Abdulgader Zaid Almaymuni (College of Science and Arts in Ar Rass, Qassim University)
Awad Abdelrahman Abdalla Mohammed (Faculty of Mathematical and Computer Sciences, University of Gezira)
Mohammed A. Saleh (College of Science and Arts in Ar Rass, Qassim University)
Nafisa A. Ali (Faculty of Mathematical and Computer Sciences, University of Gezira)



Article Info

Publish Date
01 Jan 2024

Abstract

Conjugate gradient methods are among the most important techniques for solving minimization and maximization problems, especially unconstrained nonlinear optimization problems, owing to their simplicity and low memory requirements. They are applied in many areas, such as economics, engineering, neural networks, image restoration, machine learning, and deep learning. The convergence of the Fletcher-Reeves (FR) conjugate gradient method has been established under both exact and strong Wolfe line searches; however, its practical performance is poor. In this paper, a slight modification of the FR method is proposed to obtain good numerical performance. The global convergence of the modified version is established for general nonlinear functions. Preliminary numerical results show that the modified method is very efficient in terms of the number of iterations and CPU time.
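For context, the classical FR method referenced in the abstract updates the search direction as d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = ||g_{k+1}||^2 / ||g_k||^2. The sketch below illustrates this standard (unmodified) scheme; the paper's modification is not reproduced here. A simple Armijo backtracking line search and a steepest-descent restart safeguard are used for illustration only; the paper's analysis assumes exact or strong Wolfe line searches.

```python
import math

def fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=1000):
    """Classical Fletcher-Reeves conjugate gradient (illustrative sketch).

    Uses Armijo backtracking instead of the strong Wolfe line search
    assumed in the convergence theory; restarts with steepest descent
    if the FR direction fails to be a descent direction.
    """
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]            # initial direction: steepest descent
    for _ in range(max_iter):
        gnorm2 = sum(gi * gi for gi in g)
        if math.sqrt(gnorm2) < tol:  # gradient small enough: stop
            break
        slope = sum(gi * di for gi, di in zip(g, d))
        if slope >= 0:               # safeguard: restart if not descent
            d = [-gi for gi in g]
            slope = -gnorm2
        # Armijo backtracking line search
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + c * alpha * slope:
            alpha *= rho
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # FR formula: beta = ||g_{k+1}||^2 / ||g_k||^2
        beta = sum(gi * gi for gi in g_new) / gnorm2
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

# Example on a simple quadratic with minimizer (1, -2):
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
grad = lambda x: [2 * (x[0] - 1), 20 * (x[1] + 2)]
x_star = fletcher_reeves(f, grad, [0.0, 0.0])
```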

Copyrights © 2024