Nor Haizan Mohd Radzi
Universiti Teknologi Malaysia

Published: 2 Documents

Articles

Near Optimal Convergence of Back-Propagation Method using Harmony Search Algorithm
Abdirashid Salad Nur; Nor Haizan Mohd Radzi; Siti Mariyam Shamsuddin
Indonesian Journal of Electrical Engineering and Computer Science Vol 14, No 1: April 2015
Publisher : Institute of Advanced Engineering and Science


Abstract

Training Artificial Neural Networks (ANNs) is an important and difficult task in supervised learning, as performance depends on the underlying training algorithm and on the success of the training process. In this paper, three training algorithms, namely the Back-Propagation (BP) algorithm, the Harmony Search Algorithm (HSA), and a hybrid of BP and HSA called BPHSA, are employed for the supervised training of Multi-Layer Perceptron (MLP) feed-forward Neural Networks (NNs), with special attention given to the hybrid BPHSA. A suitable data-representation structure for NNs is implemented for BPHSA-MLP, HSA-MLP and BP-MLP. The proposed method is empirically tested and verified on five benchmark classification problems, namely the Iris, Glass, Cancer, Wine and Thyroid datasets. The MSE, training time, and classification accuracy of the hybrid BPHSA are compared with those of the standard BP and the meta-heuristic HSA. The experiments showed that the proposed method achieves better convergence error and classification accuracy than BP-MLP and HSA-MLP, making BPHSA-MLP a promising algorithm for neural network training. DOI: http://dx.doi.org/10.11591/telkomnika.v14i1.7233
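
The paper's BPHSA implementation is not reproduced here, but the following minimal Python sketch illustrates the general idea of the HSA component: a flat MLP weight vector is searched through harmony memory consideration, pitch adjustment and random selection, and each candidate is scored by its training MSE. The network size, the synthetic data, and all HSA parameters (HMS, HMCR, PAR, bandwidth, iteration count) are arbitrary assumptions chosen for illustration, not the settings used in the paper.

# Illustrative sketch only: a generic Harmony Search loop optimizing the weight
# vector of a tiny MLP on synthetic data. All hyperparameters and the network
# size are assumed values, not the paper's configuration.
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic classification problem (a stand-in for the benchmark datasets).
X = rng.normal(size=(100, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

N_IN, N_HID, N_OUT = 4, 5, 2
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # total number of weights


def unpack(w):
    """Split a flat weight vector into the MLP's weight matrices and biases."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return W1, b1, W2, b2


def mse(w):
    """Mean squared error of the MLP with weights w on the training set."""
    W1, b1, W2, b2 = unpack(w)
    hidden = np.tanh(X @ W1 + b1)
    out = hidden @ W2 + b2
    target = np.eye(N_OUT)[y]
    return np.mean((out - target) ** 2)


# Harmony Search parameters (assumed values for illustration).
HMS, HMCR, PAR, BW, ITERS = 20, 0.9, 0.3, 0.05, 2000

# Harmony memory: each row is one candidate weight vector.
memory = rng.uniform(-1.0, 1.0, size=(HMS, DIM))
fitness = np.array([mse(w) for w in memory])

for _ in range(ITERS):
    new = np.empty(DIM)
    for d in range(DIM):
        if rng.random() < HMCR:
            # Memory consideration: reuse a value from a stored harmony ...
            new[d] = memory[rng.integers(HMS), d]
            if rng.random() < PAR:
                # ... with optional pitch adjustment.
                new[d] += rng.uniform(-BW, BW)
        else:
            # Random selection from the allowed weight range.
            new[d] = rng.uniform(-1.0, 1.0)
    f = mse(new)
    worst = np.argmax(fitness)
    if f < fitness[worst]:
        # Replace the worst harmony if the new one is better.
        memory[worst], fitness[worst] = new, f

best_weights = memory[np.argmin(fitness)]  # best weight vector found
print("best training MSE:", fitness.min())

In the hybrid approach described by the abstract, a gradient-based step such as back-propagation would be combined with this kind of global search; how the two are interleaved is specific to the paper and is not assumed here.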
Artificial Neural Network Weight Optimization: A Review
Abdirashid Salad Nur; Nor Haizan Mohd Radzi; Ashraf Osman Ibrahim
Indonesian Journal of Electrical Engineering and Computer Science Vol 12, No 9: September 2014
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v12.i9.pp6897-6902

Abstract

Optimizing the weights of Artificial Neural Networks (ANNs) is an important and complex task in machine learning research, because ANN performance depends on the success of the learning process and the training method. This paper reviews the use of meta-heuristic algorithms for ANN weight optimization, examining their advantages and disadvantages, with particular attention to meta-heuristics such as the Genetic Algorithm, Particle Swarm Optimization, and the more recently introduced Harmony Search Algorithm (HSA). The application of local-search-based algorithms to optimizing ANN weights, together with their benefits and limitations, is also briefly elaborated. Finally, a comparison between local search methods and global optimization methods is carried out to highlight trends in the progress of ANN weight optimization in current research.
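
As a companion to the review's discussion of global methods, the following minimal sketch shows one of the surveyed families, a bare-bones Particle Swarm Optimization loop, applied to an arbitrary loss over a flattened weight vector. The loss function, swarm size, dimensionality and coefficients are assumptions chosen only for illustration; they do not come from the review.

# Illustrative sketch only: a minimal PSO loop over a flat "weight vector".
# The loss below is a placeholder; in practice it would be a network's
# training error evaluated with the candidate weights.
import numpy as np

rng = np.random.default_rng(1)

DIM = 30                         # length of the flattened ANN weight vector (assumed)
loss = lambda w: np.sum(w ** 2)  # placeholder for a network's training error

SWARM, ITERS = 25, 500
W_INERTIA, C1, C2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients (assumed)

pos = rng.uniform(-1, 1, size=(SWARM, DIM))
vel = np.zeros((SWARM, DIM))
pbest = pos.copy()
pbest_val = np.array([loss(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(ITERS):
    r1, r2 = rng.random((SWARM, DIM)), rng.random((SWARM, DIM))
    # Velocity update: inertia plus pulls toward personal and global bests.
    vel = W_INERTIA * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best loss found:", pbest_val.min())

A local search method such as gradient descent would instead follow the loss gradient from a single starting point; the trade-off between such local refinement and population-based global exploration is the comparison the review elaborates.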