A THREE-TERM CONJUGATE GRADIENT METHOD FOR LARGE-SCALE MINIMIZATION IN ARTIFICIAL NEURAL NETWORKS
Omesa, Umar A.; Waziri, Muhammad Y.; Moghrabi, Issam A. R.; Ibrahim, Sulaiman M.; Gudu, E. B.; Fakai, S. L.; Yunus, Rabiu Bashir; Madi, Elissa Nadia
BAREKENG: Jurnal Ilmu Matematika dan Terapan Vol 19 No 3 (2025): BAREKENG: Journal of Mathematics and Its Application
Publisher: Pattimura University

DOI: 10.30598/barekengvol19iss3pp1973-1988

Abstract

Conjugate Gradient (CG) methods are widely used for solving unconstrained optimization problems because of their efficiency and low memory requirements. However, standard CG methods do not always guarantee the sufficient descent condition, which can affect their robustness and convergence behavior. Moreover, their effectiveness in training artificial neural networks (ANNs) remains an area of interest. In response, this paper presents a three-term CG method for unconstrained optimization problems. The new parameter is formulated so that the search direction satisfies the sufficient descent condition, and the global convergence of the new algorithm is discussed under suitable assumptions. To evaluate the performance of the new method, we considered a set of standard unconstrained optimization test problems and applied the proposed method to train different ANNs on benchmark data sets contained in the NN toolbox. The experimental results are encouraging on both the unconstrained minimization test problems and the neural network training tasks.
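The abstract does not state the paper's exact three-term parameter, so the following is only a minimal Python sketch of a generic three-term CG iteration of the form d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k, with illustrative Hestenes-Stiefel-type coefficients and a simple Armijo backtracking line search. The function names, coefficient choices, and tolerances are assumptions for illustration, not the authors' method.

import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    # Generic three-term CG sketch (illustrative, not the paper's exact scheme).
    # Direction: d_{k+1} = -g_{k+1} + beta_k*d_k - theta_k*y_k, with y_k = g_{k+1} - g_k.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Simple Armijo backtracking line search (placeholder for the
        # Wolfe-type search usually assumed in CG convergence analyses).
        alpha, rho, c1 = 1.0, 0.5, 1e-4
        fx, gtd = f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * gtd and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        if abs(denom) < 1e-12:
            d = -g_new                      # restart when the denominator degenerates
        else:
            beta = (g_new @ y) / denom      # HS-type beta (illustrative choice)
            theta = (g_new @ d) / denom     # third-term coefficient (illustrative choice)
            d = -g_new + beta * d - theta * y
            # With these coefficients, g_new @ d = -||g_new||^2, i.e. sufficient descent.
        x, g = x_new, g_new
    return x

# Quick check on the Rosenbrock function.
rosen = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
rosen_grad = lambda x: np.array([
    -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
    200.0 * (x[1] - x[0]**2),
])
print(three_term_cg(rosen, rosen_grad, [-1.2, 1.0]))  # should approach (1, 1)

With these illustrative coefficients the direction satisfies g_{k+1}^T d_{k+1} = -||g_{k+1}||^2, which is one common way a three-term construction enforces sufficient descent independently of the line search; the paper derives its own parameter with the same goal.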