In unconstrained nonlinear optimization, gradient-based methods are widely used because the gradient indicates the direction in which a function's value increases or decreases most rapidly. One such method is the conjugate gradient method, which has been extensively modified; a well-known variant is the Fletcher-Reeves method. In many cases, however, these methods do not produce a correct descent direction, which degrades their speed and convergence, and this shortcoming has motivated further modifications. The purpose of this study is to examine how the formula of a modified Fletcher-Reeves method is derived, to develop the corresponding algorithm, and to analyze its global convergence. Numerical simulation results show that, by selecting an appropriate value, the modified Fletcher-Reeves method converges to the global minimum solution and finds it faster than the original Fletcher-Reeves method.
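For reference, the sketch below illustrates the classical Fletcher-Reeves conjugate gradient iteration that the study modifies, using the standard update $\beta_k = \|\nabla f(x_{k+1})\|^2 / \|\nabla f(x_k)\|^2$ and direction $d_{k+1} = -\nabla f(x_{k+1}) + \beta_k d_k$. It is a minimal illustration only: the modified formula studied in the paper is not reproduced here, and the backtracking line-search parameters, the restart safeguard, and the Rosenbrock test function in the usage example are illustrative assumptions, not part of the original text.

```python
import numpy as np

def fletcher_reeves(f, grad, x0, tol=1e-6, max_iter=1000):
    """Classical Fletcher-Reeves conjugate gradient with a backtracking line search.

    This is the standard method that the paper modifies; the modified
    beta formula itself is not shown here.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:               # safeguard: restart if d is not a descent direction
            d = -g
        # Backtracking (Armijo) line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves update: beta = ||g_{k+1}||^2 / ||g_k||^2
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d           # new conjugate direction
        x, g = x_new, g_new
    return x

# Usage example on the Rosenbrock function (illustrative test problem)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(fletcher_reeves(f, grad, np.array([-1.2, 1.0])))
```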