Found 3 Documents

An enhanced Fletcher-Reeves-like conjugate gradient methods for image restoration
Hassan, Basim A.; Jabbar, Hawraz N.; Laylani, Yoksal A.; Rahman Moghrabi, Issam Abdul; Alissa, Ali Joma
International Journal of Electrical and Computer Engineering (IJECE) Vol 13, No 6: December 2023
Publisher: Institute of Advanced Engineering and Science

DOI: 10.11591/ijece.v13i6.pp6268-6276

Abstract

Noise is an unavoidable aspect of modern camera technology and degrades the overall visual quality of images. Efforts are underway to reduce noise without compromising essential image features such as edges, corners, and other fine structures. Numerous noise-reduction techniques have been proposed, each with its own benefits and drawbacks, and image denoising remains a fundamental challenge in image processing. In this study, we describe a two-phase approach for removing impulse noise. In the first phase, an adaptive median filter (AMF) identifies salt-and-pepper noise candidates. In the second phase, an edge-preserving regularization function is minimized using a novel hybrid conjugate gradient approach that combines two well-known, successful conjugate gradient techniques to generate the improved search direction. The descent property and global convergence of the new methods are proven. The numerical results reveal that, when applied to image restoration, the new algorithms are superior to the classical Fletcher-Reeves (FR) method in terms of preserving image quality and efficiency.
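
As a concrete illustration (not the paper's implementation), the Python sketch below wires the two phases together: detect_impulses is a simplified AMF-style test, the regularizer phi(t) = sqrt(t^2 + eps) is a standard smooth edge-preserving choice, and the plain Fletcher-Reeves direction stands in for the hybrid direction, whose formula the abstract does not give. All function names here are illustrative.

```python
# Minimal sketch of the two-phase impulse-noise removal pipeline, with
# simplified stand-ins for the paper's detector, regularizer, and direction.
import numpy as np
from scipy.ndimage import median_filter

EPS = 1e-2  # smoothing constant in phi(t) = sqrt(t^2 + EPS)

def detect_impulses(img, extremes=(0, 255)):
    """Phase 1 (simplified): flag extreme-valued pixels that disagree
    with their 3x3 median, as an AMF-style salt-and-pepper detector."""
    med = median_filter(img, size=3)
    return np.isin(img, extremes) & (img != med)

def objective(u):
    """Edge-preserving regularization: sum of phi over pixel differences."""
    f = 0.0
    for axis in (0, 1):
        du = np.diff(u, axis=axis)
        f += np.sqrt(du * du + EPS).sum()
    return f

def gradient(u, mask):
    """Gradient of the objective, zeroed on clean (fixed) pixels."""
    g = np.zeros_like(u)
    for axis in (0, 1):
        p = np.diff(u, axis=axis)
        p = p / np.sqrt(p * p + EPS)      # phi'(du)
        lo = [(0, 0), (0, 0)]
        hi = [(0, 0), (0, 0)]
        lo[axis], hi[axis] = (1, 0), (0, 1)
        g += np.pad(p, lo) - np.pad(p, hi)
    g[~mask] = 0.0                         # only noisy pixels move
    return g

def restore(img, iters=100):
    """Phase 2: Fletcher-Reeves CG over the flagged pixels."""
    mask = detect_impulses(img)
    u = img.astype(float)
    g = gradient(u, mask)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-6:
            break
        alpha, f0, slope = 1.0, objective(u), (g * d).sum()
        while objective(u + alpha * d) > f0 + 1e-4 * alpha * slope:
            alpha *= 0.5                   # Armijo backtracking
            if alpha < 1e-12:
                break
        u += alpha * d
        g_new = gradient(u, mask)
        beta = (g_new * g_new).sum() / max((g * g).sum(), 1e-30)  # FR
        d, g = -g_new + beta * d, g_new
        if (g * d).sum() >= 0:             # safeguard: restart with -g
            d = -g
    return u
```

In practice the restored values would be clipped back to [0, 255]; the hybrid direction and the convergence analysis in the paper replace the plain FR step used here.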
NUMERICAL AND CONVERGENCE ANALYSIS OF AN ENHANCED DAI-LIAO METHOD FOR UNCONSTRAINED OPTIMIZATION
Hassan, Basim A.; Sulaiman, Ibrahim Mohammed; Subhi, Yeldez J.
BAREKENG: Jurnal Ilmu Matematika dan Terapan Vol 19 No 4 (2025): BAREKENG: Journal of Mathematics and Its Application
Publisher: PATTIMURA UNIVERSITY

DOI: 10.30598/barekengvol19iss4pp2993-3004

Abstract

Iterative algorithms play an important role in mathematical optimization, particularly in solving large-scale unconstrained optimization problems. Conjugate gradient (CG) methods are widely used due to their low memory requirements and efficiency, but their performance depends heavily on the choice of the parameter that shapes the search direction and governs convergence speed. Traditional CG algorithms can therefore suffer from slow convergence or poor accuracy, especially on ill-conditioned problems, and improved parameter-selection strategies are needed to enhance solution accuracy and efficiency. This study constructs a new conjugate gradient parameter using the curvature condition to refine the search direction and accelerate convergence, ensuring a more effective balance between the descent property and numerical stability. Preliminary numerical experiments demonstrate that the proposed method outperforms classical CG variants in convergence rate and accuracy: the improved search directions lead to faster and more reliable solutions. The new formula thus contributes to more robust and efficient optimization and broadens the applicability of CG methods to complex problems.
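
The abstract does not state the enhanced parameter itself, so the sketch below is only a generic CG driver showing where such a parameter plugs in: beta_dl is the classical Dai-Liao formula (with a fixed t = 0.1) used purely as a placeholder, and the curvature condition s_k^T y_k > 0 that the construction relies on gates the update. The test function and all names are illustrative.

```python
# Generic line-search CG driver; beta_dl is a classical Dai-Liao
# placeholder, not the paper's enhanced parameter.
import numpy as np

def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def beta_dl(g_new, s, y, d, t=0.1):
    # classical Dai-Liao parameter: (g_{k+1}^T (y_k - t s_k)) / (d_k^T y_k)
    return g_new @ (y - t * s) / (d @ y)

def cg(f, grad_f, x0, beta_fn, tol=1e-8, max_iter=5000):
    x, g = x0.copy(), grad_f(x0)
    d = -g
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        fx, slope, alpha = f(x), g @ d, 1.0   # Armijo backtracking
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-16:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        beta = beta_fn(g_new, s, y, d) if s @ y > 1e-12 else 0.0  # curvature gate
        d = -g_new + beta * d
        if g_new @ d >= 0:                    # safeguard: enforce descent
            d = -g_new
        x, g = x_new, g_new
    return x, k

x_star, iters = cg(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]), beta_dl)
print(x_star, iters)   # expect x close to (1, 1)
```

Swapping beta_fn is the only change needed to compare parameter choices, which mirrors how the numerical experiments in such papers are typically set up.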
NEW CONJUGATE GRADIENT METHOD FOR ACCELERATED CONVERGENCE AND COMPUTATIONAL EFFICIENCY IN UNCONSTRAINED OPTIMIZATION PROBLEMS
Hassan, Basim A.; Ibrahim, Alaa Luqman; Ameen, Thaair A.; Sulaiman, Ibrahim Mohammed
BAREKENG: Jurnal Ilmu Matematika dan Terapan Vol 20 No 1 (2026): BAREKENG: Journal of Mathematics and Its Application
Publisher: PATTIMURA UNIVERSITY

DOI: 10.30598/barekengvol20iss1pp0481-0492

Abstract

Conjugate gradient (CG) algorithms play an important role in solving large-scale unconstrained optimization problems due to their low memory requirements and strong convergence properties. However, many classical CG algorithms suffer from inefficiencies when dealing with complex or ill-conditioned objective functions. This paper addresses this challenge by proposing a new conjugate gradient method that combines the descent direction of traditional CG algorithms with Newton-type updates to improve convergence and computational efficiency. The proposed method is constructed to ensure sufficient descent at each iteration and global convergence under standard assumptions. By integrating the modified Newton update mechanism, the method effectively accelerates convergence without incurring the high computational cost typically associated with full Newton methods. To evaluate the performance of the proposed approach, we conducted extensive numerical experiments using a collection of well-known benchmark functions from the CUTEr test suite. The results show that the new method consistently outperforms the classical Hestenes-Stiefel method in terms of CPU time, number of function evaluations, and iteration count. These findings confirm the method’s potential as an efficient and robust alternative for solving large-scale unconstrained optimization problems.
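
The abstract leaves the Newton-type update unspecified. As one common way such hybrids are built, the sketch below contrasts the Hestenes-Stiefel baseline direction with a memoryless-BFGS (Perry-Shanno) direction, which folds quasi-Newton curvature information from the single pair (s_k, y_k) into a matrix-free, CG-style iteration; it is an illustrative stand-in, not the paper's formula.

```python
# Two direction updates: the Hestenes-Stiefel baseline the paper compares
# against, and a memoryless-BFGS direction as an illustrative Newton-type
# modification (not the paper's method).
import numpy as np

def direction_hs(g_new, y, d):
    # Hestenes-Stiefel: beta_k = g_{k+1}^T y_k / d_k^T y_k
    beta = (g_new @ y) / (d @ y)
    return -g_new + beta * d

def direction_memoryless_bfgs(g_new, s, y):
    # d_{k+1} = -H_{k+1} g_{k+1}, where H_{k+1} is the BFGS update of the
    # identity built from (s_k, y_k); written out so no matrix is formed.
    sy = s @ y
    a = (s @ g_new) / sy
    b = (y @ g_new) / sy
    return -g_new + a * y + b * s - (1.0 + (y @ y) / sy) * a * s
```

Either routine drops into a line-search loop like the one sketched under the previous entry; a practical implementation would also guard against d @ y or s @ y being near zero, which is where the curvature safeguards in these papers come in.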