Tan Wei Hong
Universiti Malaysia Perlis

Published: 3 Documents
Articles


An improved radial basis function networks based on quantum evolutionary algorithm for training nonlinear datasets
Lim Eng Aik; Tan Wei Hong; Ahmad Kadri Junoh
IAES International Journal of Artificial Intelligence (IJ-AI) Vol 8, No 2: June 2019
Publisher : Institute of Advanced Engineering and Science

Full PDF (589.412 KB) | DOI: 10.11591/ijai.v8.i2.pp120-131

Abstract

In neural networks, accuracy depends mainly on two factors: the centers and the spread values. The radial basis function network (RBFN) is a type of feedforward network capable of nonlinear approximation on unknown datasets, and it has been widely used in classification, pattern recognition, nonlinear control, and image processing. With the growth in RBFN applications, several problems and weaknesses of the network have been identified. Combining quantum computing with the RBFN offers a new research direction for the design and performance improvement of RBFN systems. This paper describes the theory and application of quantum computing and cloning operators, and discusses the advantages of these theories and the feasibility of their optimization algorithms. The proposed improved RBFN (I-RBFN), which combines a cloning operator with a quantum computing algorithm, demonstrates the ability to perform global search and local optimization, effectively speeding up learning and providing better prediction accuracy. Both algorithms combined with the RBFN optimize the centers and spread values of the RBFN. The proposed I-RBFN was tested against the standard RBFN in prediction tasks. The experimental models were evaluated on four nonlinear benchmark functions from the literature and four real-world application problems: an air pollutant problem, a biochemical oxygen demand (BOD) problem, a phytoplankton problem, and the EURUSD forex pair. The root mean square error (RMSE) values of the I-RBFN are compared with those of the standard RBFN. The proposed I-RBFN yielded better results, with an average RMSE improvement of more than 90 percent.
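To make the optimized quantities concrete, the sketch below shows a minimal Gaussian RBFN in Python (NumPy) whose centers and spread values define the hidden-layer design matrix, with RMSE as the fitness a search procedure would minimize. The quantum computing algorithm and cloning operator described in the paper are not reproduced here; the random candidate centers and spreads are placeholders for whatever the evolutionary search would propose.

```python
# Minimal sketch of the RBFN quantities the paper optimizes (centers and
# spread values), assuming Gaussian basis functions and a least-squares
# output layer. The quantum evolutionary search and cloning operator are
# NOT implemented; candidates here are random placeholders.
import numpy as np

def rbf_design_matrix(X, centers, spreads):
    """Gaussian activations phi_ij = exp(-||x_i - c_j||^2 / (2 * s_j^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * spreads[None, :] ** 2))

def fit_output_weights(X, y, centers, spreads):
    """Solve the linear output layer for fixed centers and spreads."""
    Phi = rbf_design_matrix(X, centers, spreads)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def rmse_fitness(X, y, centers, spreads):
    """RMSE used as the fitness an evolutionary search would minimize."""
    w = fit_output_weights(X, y, centers, spreads)
    pred = rbf_design_matrix(X, centers, spreads) @ w
    return np.sqrt(np.mean((pred - y) ** 2))

# Toy usage: score one random candidate on a 1-D nonlinear target.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
centers = rng.uniform(-3, 3, size=(10, 1))
spreads = np.full(10, 0.8)
print("candidate RMSE:", rmse_fitness(X, y, centers, spreads))
```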
Distance weighted K-Means algorithm for center selection in training radial basis function networks
Lim Eng Aik; Tan Wei Hong; Ahmad Kadri Junoh
IAES International Journal of Artificial Intelligence (IJ-AI) Vol 8, No 1: March 2019
Publisher : Institute of Advanced Engineering and Science

Full PDF (606.31 KB) | DOI: 10.11591/ijai.v8.i1.pp54-62

Abstract

The accuracy of a neural network depends mainly on the selection of suitable data centers. The K-means algorithm is a clustering algorithm widely used across disciplines for center selection. However, the method is known for its sensitivity to the choice of initial centers: it depends heavily not only on the initial centers but also on the data points themselves. The performance of K-means has been enhanced from different perspectives over the years, including the centroid initialization problem. Unfortunately, these solutions do not provide a good trade-off between the quality and the efficiency of the centers produced by the algorithm. To solve this problem, a new method to find the initial centers and reduce the sensitivity of the K-means algorithm to its initial centers is proposed. This paper presents a training algorithm for the radial basis function network (RBFN) using an improved K-means (KM) algorithm, a modified version of KM based on a distance-weighted adjustment of each center, known as the distance-weighted K-means (DWKM) algorithm. The proposed training algorithm, which uses DWKM to select centers for training the RBFN, achieved better prediction accuracy with a smaller network architecture than the standard RBFN. The proposed training algorithm was implemented in the MATLAB environment; hence, the new network undergoes a hybrid learning process. The resulting network, called DWKM-RBFN, was tested against the standard RBFN in prediction tasks. The experimental models were evaluated on four nonlinear benchmark functions from the literature and four real-world application problems: an air pollutant problem, a biochemical oxygen demand (BOD) problem, a phytoplankton problem, and the EURUSD forex pair. The root mean square error (RMSE) values of the proposed method are compared with those of the standard RBFN. The proposed method yielded promising results, with an average RMSE improvement of more than 50 percent.
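The sketch below illustrates a distance-weighted centroid update for selecting RBFN centers. The abstract does not give the exact weighting formula, so the inverse-distance weights used here (points far from their current center contribute less to the update) are an assumption for illustration only, not the authors' DWKM rule.

```python
# Illustrative distance-weighted K-means for RBFN center selection.
# The inverse-distance weighting below is an assumed choice; the paper's
# exact distance-weighted adjustment is not reproduced here.
import numpy as np

def distance_weighted_kmeans(X, k, n_iter=50, eps=1e-8, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to its nearest center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Distance-weighted centroid update (assumed inverse-distance weights).
        for j in range(k):
            pts = X[labels == j]
            if len(pts) == 0:
                continue
            w = 1.0 / (np.linalg.norm(pts - centers[j], axis=1) + eps)
            centers[j] = (w[:, None] * pts).sum(axis=0) / w.sum()
    return centers, labels

# Toy usage: pick 3 candidate RBFN centers from three 2-D clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.3, size=(100, 2)) for m in (-2, 0, 2)])
centers, labels = distance_weighted_kmeans(X, k=3)
print("selected centers:\n", centers)
```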
An improved radial basis function networks in networks weights adjustment for training real-world nonlinear datasets
Lim Eng Aik; Tan Wei Hong; Ahmad Kadri Junoh
IAES International Journal of Artificial Intelligence (IJ-AI) Vol 8, No 1: March 2019
Publisher : Institute of Advanced Engineering and Science

Full PDF (1118.074 KB) | DOI: 10.11591/ijai.v8.i1.pp63-76

Abstract

In neural networks, accuracy depends mainly on two factors: the centers and the network weights. The gradient descent algorithm is a widely used weight adjustment method in most neural network training algorithms. However, the method is known for its tendency to become trapped in local minima, and it suffers from the random weights generated for the input-to-hidden-layer connections during the initial stage of training. The performance of the radial basis function network (RBFN) has been improved from different perspectives over the years, from the centroid initialization problem to the weight correction stage. Unfortunately, these solutions do not provide a good trade-off between the quality and the efficiency of the weights produced by the algorithm. To solve this problem, an improved gradient descent algorithm for finding the initial weights and improving the overall network weights is proposed. This improved algorithm is incorporated into the RBFN training algorithm for updating the weights. Hence, this paper presents an improved RBFN in terms of the weight adjustment algorithm used during training. The proposed training algorithm, which uses the improved gradient descent algorithm for weight adjustment, obtained significantly better predictions than the standard RBFN. The proposed training algorithm was implemented in the MATLAB environment. The improved network, called IRBFN, was tested against the standard RBFN in prediction tasks. The experimental models were evaluated on four nonlinear benchmark functions from the literature and four real-world application problems: an air pollutant problem, a biochemical oxygen demand (BOD) problem, a phytoplankton problem, and the EURUSD forex pair. The root mean square error (RMSE) values of the IRBFN are compared with those of the standard RBFN. The IRBFN yielded promising results, with an average RMSE improvement of more than 40 percent.
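For context, the sketch below shows plain batch gradient descent on an RBFN's hidden-to-output weights under a squared-error loss, assuming Gaussian hidden units with fixed centers and spread. The paper's improved initialization and update rule are not specified in the abstract and are not reproduced; this is only the baseline update the paper sets out to improve.

```python
# Baseline gradient-descent weight adjustment for an RBFN output layer,
# assuming Gaussian hidden units and squared-error loss. The authors'
# improved initialization/update rule is not implemented here.
import numpy as np

def gaussian_hidden_layer(X, centers, spread):
    """Hidden activations phi_ij = exp(-||x_i - c_j||^2 / (2 * spread^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * spread ** 2))

def train_output_weights(X, y, centers, spread, lr=0.05, epochs=500, seed=0):
    rng = np.random.default_rng(seed)
    Phi = gaussian_hidden_layer(X, centers, spread)   # (n_samples, n_centers)
    w = 0.01 * rng.standard_normal(Phi.shape[1])      # random initial weights
    for _ in range(epochs):
        err = Phi @ w - y                             # prediction error
        grad = Phi.T @ err / len(y)                   # gradient of mean squared error (up to a factor)
        w -= lr * grad                                # gradient descent step
    rmse = np.sqrt(np.mean((Phi @ w - y) ** 2))
    return w, rmse

# Toy usage on a 1-D nonlinear target with fixed, evenly spaced centers.
rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0])
centers = np.linspace(-3, 3, 12).reshape(-1, 1)
w, rmse = train_output_weights(X, y, centers, spread=0.7)
print("training RMSE:", rmse)
```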