Nonlinear conjugate gradient (CG) methods are widely used for solving large-scale unconstrained optimization problems, which arise in many areas of science, engineering, and economics. Their popularity stems from their simplicity, strong convergence properties, and low memory requirements. At each iteration, a CG method generates a new approximate solution, usually under the strong Wolfe line search. To improve performance, many studies have modified well-known CG methods. In this paper, we modify the RMIL+ CG method to obtain a new CG method that possesses the sufficient descent property and is globally convergent under the strong Wolfe line search. Numerical results demonstrate that the proposed method outperforms several existing CG methods.
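To make the setting concrete, the following is a minimal sketch of a generic nonlinear CG loop under a strong Wolfe line search. The conjugacy parameter here is the classical PR+ formula, used only as an illustrative stand-in; the paper's RMIL+ method uses its own modified formula, which is not reproduced here. The helper name `nonlinear_cg` and the use of SciPy's `line_search` (which enforces the strong Wolfe conditions) are this sketch's assumptions, not part of the paper.

```python
# Sketch of nonlinear CG with a strong Wolfe line search.
# The beta below is the classical PR+ choice, used as an illustrative
# stand-in for the paper's RMIL+ formula (not reproduced here).
import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=2000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Strong Wolfe line search; c2 < 0.5 is customary for CG methods.
        alpha = line_search(f, grad, x, d, gfk=g, c2=0.1)[0]
        if alpha is None:                   # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PR+ conjugacy parameter (illustrative; RMIL+ replaces this formula)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from the standard start point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
x_star = nonlinear_cg(f, grad, [-1.2, 1.0])
```

The restart to steepest descent when the line search fails is a common safeguard; the sufficient descent and global convergence properties claimed in the paper depend on the specific RMIL+ formula, not on this generic loop.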
Copyright © 2022