Basim A. Hassan
University of Mosul

Published: 8 Documents

Using a new type of formula conjugate on the gradient methods
Basim A. Hassan; Ranen M. Sulaiman
Indonesian Journal of Electrical Engineering and Computer Science Vol 27, No 1: July 2022
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v27.i1.pp86-91

Abstract

Unconstrained optimization problems, such as energy minimization, can be solved using the conjugate gradient method. The conjugate parameter is the defining ingredient of a conjugate gradient algorithm and largely determines its behaviour. In this work we derive a new conjugate parameter from a second-order Taylor expansion of the objective function and show that the resulting search directions satisfy the descent property. The convergence of the proposed method under the Wolfe line-search conditions is established, and its numerical performance is examined. Compared with the Fletcher-Reeves (FR) method, the new algorithm shows a significant improvement in the numerical results.
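
For context, the baseline the abstract compares against is the Fletcher-Reeves (FR) conjugate gradient method. The sketch below shows the generic nonlinear conjugate gradient iteration with the FR coefficient; the paper's Taylor-based conjugate parameter is not reproduced here, and the simple Armijo backtracking routine is only a stand-in for the Wolfe line search assumed in the analysis.

import numpy as np

def backtracking_line_search(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    # Simple Armijo backtracking; a full Wolfe line search would also
    # enforce a curvature condition on grad(x + alpha * d).
    fx, gd = f(x), grad(x) @ d
    while f(x + alpha * d) > fx + c * alpha * gd:
        alpha *= rho
    return alpha

def conjugate_gradient_fr(f, grad, x0, tol=1e-6, max_iter=1000):
    # Generic nonlinear CG; only the choice of beta distinguishes the methods.
    x = x0.astype(float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking_line_search(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves coefficient
        d = -g_new + beta * d                # new search direction
        g = g_new
    return x

For example, conjugate_gradient_fr(lambda x: x @ x, lambda x: 2 * x, np.ones(3)) converges to the minimizer at the origin.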
Two-versions of descent conjugate gradient methods for large-scale unconstrained optimization
Hawraz N. Jabbar; Basim A. Hassan
Indonesian Journal of Electrical Engineering and Computer Science Vol 22, No 3: June 2021
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v22.i3.pp1643-1649

Abstract

Conjugate gradient methods are exceedingly valuable for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. As in most work on conjugate gradient methods, the conjugate parameter is the focus of this paper, which proposes two new conjugate parameters for large-scale unconstrained optimization. A diagonal Hessian approximation derived from second- and third-order Taylor series expansions is employed in this study. The sufficient descent property of the proposed algorithms is proved and their global convergence is established. The new algorithms are found to be competitive with the Fletcher-Reeves (FR) algorithm in a number of numerical experiments.
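
One standard way to build such an approximation (shown only as an illustration of the device; the paper's own second- and third-order construction may differ) is to choose a scaled identity $B_{k+1} = \theta_{k+1} I$ that matches a truncated Taylor expansion of the objective along the last step $s_k = x_{k+1} - x_k$:

\[
f(x_k) \approx f(x_{k+1}) - g_{k+1}^{T} s_k + \tfrac{1}{2}\, s_k^{T} B_{k+1} s_k ,
\]

which gives

\[
\theta_{k+1} = \frac{2\left( f(x_k) - f(x_{k+1}) + g_{k+1}^{T} s_k \right)}{s_k^{T} s_k} .
\]

The scalar (or, entrywise, diagonal) quantity $\theta_{k+1}$ then replaces the exact Hessian wherever it appears in the conjugate parameter.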
A new class of self-scaling for quasi-newton method based on the quadratic model
Basim A. Hassan; Ranen M. Sulaiman
Indonesian Journal of Electrical Engineering and Computer Science Vol 21, No 3: March 2021
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v21.i3.pp1830-1836

Abstract

The quasi-Newton method is an efficient method for solving unconstrained optimization problems, and self-scaling is one of the common approaches to modifying it; a large variety of self-scaling quasi-Newton methods is well known. In this paper, we derive a new self-scaling quasi-Newton method based on a quadratic model of the objective function and study its convergence properties. Numerical results on a collection of test problems show that the proposed self-scaling improves the overall numerical performance of the BFGS method.
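
As an illustration of where a self-scaling parameter enters, the sketch below applies the classical Oren-Luenberger scaling factor to the BFGS update of the Hessian approximation; the paper's quadratic-model-based scaling factor is not reproduced here.

import numpy as np

def scaled_bfgs_update(B, s, y):
    # B : current Hessian approximation
    # s = x_{k+1} - x_k,  y = g_{k+1} - g_k  (assumes y @ s > 0)
    Bs = B @ s
    tau = (y @ s) / (s @ Bs)       # self-scaling factor (Oren-Luenberger)
    B, Bs = tau * B, tau * Bs      # rescale before the standard BFGS update
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

Replacing tau by 1 recovers the ordinary BFGS update.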
A new hybrid conjugate gradient algorithm for optimization models and its application to regression analysis
Ibrahim Mohammed Sulaiman; Norsuhaily Abu Bakar; Mustafa Mamat; Basim A. Hassan; Maulana Malik; Alomari Mohammad Ahmed
Indonesian Journal of Electrical Engineering and Computer Science Vol 23, No 2: August 2021
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v23.i2.pp1100-1109

Abstract

Hybrid conjugate gradient (CG) methods are among the efficient variants of the CG method for solving optimization problems, owing to their low memory requirements and good convergence properties. In this paper, we present an efficient hybrid CG method for solving unconstrained optimization models and show that it satisfies the sufficient descent condition. The global convergence of the proposed method is established under an inexact line search. The method is then applied to a statistical regression model describing the global outbreak of COVID-19, parameterized using the weekly increase or decrease of recorded cases from December 30, 2019 to March 30, 2020. Preliminary numerical results on some unconstrained optimization problems show that the proposed method is efficient and promising, and it produced a good regression equation for confirmed COVID-19 cases globally.
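
As background on the hybridization idea (not the specific coefficient proposed in the paper), a classical hybrid coefficient combines the Hestenes-Stiefel and Dai-Yuan formulas:

\[
\beta_k^{HS} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k}, \qquad
\beta_k^{DY} = \frac{\|g_{k+1}\|^{2}}{d_k^{T} y_k}, \qquad
\beta_k^{hyb} = \max\left\{0,\ \min\left\{\beta_k^{HS},\ \beta_k^{DY}\right\}\right\},
\]

where $y_k = g_{k+1} - g_k$ and $d_k$ is the current search direction; truncating the coefficient in this way is one common route to the descent and global convergence properties mentioned in the abstract.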
A new conjugate gradient algorithms using conjugacy condition for solving unconstrained optimization
Aseel M. Qasim; Zinah F. Salih; Basim A. Hassan
Indonesian Journal of Electrical Engineering and Computer Science Vol 24, No 3: December 2021
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v24.i3.pp1647-1653

Abstract

The primary objective of this paper, situated in the field of conjugate gradient algorithms for unconstrained optimization, is to show the advantage of the proposed algorithm over the standard Hestenes-Stiefel method. Because the conjugate parameter is crucial to these methods, we propose a simple modification of it and use this modification to derive the new formula for the conjugate gradient update parameter described in this paper. The modification is based on the conjugacy condition for nonlinear conjugate gradient methods, extended with a nonnegative parameter. Under mild Wolfe conditions, the global convergence theorem and supporting lemmas are stated and proved. The efficiency of the proposed method is demonstrated on numerical test instances, and the results are very encouraging.
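
For reference, the conjugacy condition referred to in the abstract, and its extension with a nonnegative parameter $t$ (shown here in the general Dai-Liao form; the exact formula derived in the paper may differ), reads

\[
d_{k+1}^{T} y_k = 0
\qquad\text{and}\qquad
d_{k+1}^{T} y_k = -t\, g_{k+1}^{T} s_k, \quad t \ge 0 ,
\]

and substituting $d_{k+1} = -g_{k+1} + \beta_k d_k$ gives coefficients of the form

\[
\beta_k = \frac{g_{k+1}^{T} y_k - t\, g_{k+1}^{T} s_k}{d_k^{T} y_k},
\]

which reduces to the Hestenes-Stiefel coefficient when $t = 0$.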
Using a new coefficient conjugate gradient method for solving unconstrained optimization problems
Ranen M. Sulaiman; Basim A. Hassan
Indonesian Journal of Electrical Engineering and Computer Science Vol 27, No 3: September 2022
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v27.i3.pp1642-1648

Abstract

The conjugate gradient technique is a numerical strategy for solving minimization problems. We present a simple, efficient, and robust conjugate gradient technique in this study. To address convergence and the descent property, the new technique is built on a quadratic model of the objective function. Under some assumptions, the improved approach satisfies the convergence requirements and the sufficient descent criterion. According to our numerical analysis, the proposed strategy is substantially more efficient than the classical FR method; the reported results include the numbers of iterations, function evaluations, and restarts, and the comparative results demonstrate the computational efficiency of the proposed approach.
Improvement of conjugate gradient methods for removing impulse noise images
Basim A. Hassan; Ali Ahmed A. Abdullah
Indonesian Journal of Electrical Engineering and Computer Science Vol 29, No 1: January 2023
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v29.i1.pp245-251

Abstract

Optimization problems occur in most disciplines, including engineering, physics, mathematics, economics, administration, commerce, the social sciences, and even politics. The conjugate coefficient is the cornerstone of conjugate gradient algorithms with the desired conjugacy property. In this study, we extract new second-order information about the Hessian from the objective function, which leads to a new search direction. Based on this search direction, we propose an update formula and the corresponding nonlinear conjugate gradient technique. Under the Wolfe line search and moderate assumptions on the objective function, the method satisfies the sufficient descent property and is globally convergent. Numerical results show that the technique is successful and competitive in recovering the original picture from an image corrupted by impulse noise.
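
For context, the image restoration papers in this list follow the common two-phase approach: an adaptive median filter first flags the set $\mathcal{N}$ of noise-candidate pixels, and a smooth edge-preserving functional is then minimized over those pixels by the conjugate gradient method. A typical functional of this kind (background only; the exact variant used in the paper may differ) is

\[
F(u) = \sum_{(i,j)\in\mathcal{N}} \Bigg[ \sum_{(m,n)\in V_{i,j}\setminus\mathcal{N}} \varphi_{\alpha}\big(u_{i,j}-y_{m,n}\big)
+ \tfrac{1}{2} \sum_{(m,n)\in V_{i,j}\cap\mathcal{N}} \varphi_{\alpha}\big(u_{i,j}-u_{m,n}\big) \Bigg],
\qquad \varphi_{\alpha}(t)=\sqrt{t^{2}+\alpha},
\]

where $y$ is the observed image, $V_{i,j}$ are the neighbours of pixel $(i,j)$, and $\alpha > 0$ is a smoothing parameter.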
On image restoration problems using new conjugate gradient methods
Basim A. Hassan; Haneen A. Alashoor
Indonesian Journal of Electrical Engineering and Computer Science Vol 29, No 3: March 2023
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v29.i3.pp1438-1445

Abstract

The nonlinear conjugate gradient algorithm is one of the effective algorithms for optimization because of its low storage requirements and simple structure. The conjugate coefficient is the basis of conjugate gradient algorithms with the desirable conjugacy property. In this manuscript, we derive new second-order information about the Hessian from the objective function, which yields a new search direction. Based on this search direction, we propose an update formula and the corresponding nonlinear conjugate gradient method. Under the Wolfe line search and mild assumptions on the objective function, the method possesses the sufficient descent property and is globally convergent. Numerical results show that the method is effective and competitive in recovering the original image from an image corrupted by impulse noise.