This author has published in the following journal: Jurnal Ilmiah SINUS
Catur Supriyanto
Informatics Engineering, Universitas Dian Nuswantoro, Semarang

Published: 1 document

Articles

Metode Pembobotan Jarak dengan Koefisien Variasi untuk Mengatasi Kelemahan Euclidean Distance pada Algoritma k-Nearest Neighbor (Distance Weighting Method with the Coefficient of Variation to Overcome the Weakness of Euclidean Distance in the k-Nearest Neighbor Algorithm)
Agustiyar Agustiyar; Romi Satria Wahono; Catur Supriyanto
Jurnal Ilmiah SINUS, Vol. 20, No. 1, January 2022
Publisher: STMIK Sinar Nusantara

DOI: 10.30646/sinus.v20i1.565

Abstract

k-Nearest Neighbor (k-NN) is one of the top 10 classification algorithms in data mining. It is simple and easy to apply, but its classification results are strongly influenced by the scale of the input data. Euclidean distance treats all attributes as equally important, regardless of how relevant each attribute actually is, which degrades classification accuracy: some attributes are only weakly relevant, or even irrelevant, to the classification result. To overcome this weakness of k-NN, Zolghadri, Parvinnia, and John proposed Weighted Distance Nearest Neighbor (WDNN), which outperforms k-NN; however, its performance drops when k > 1. Gou proposed Dual Distance Weighted Voting k-Nearest Neighbor (DWKNN), which also outperforms k-NN, but DWKNN focuses on determining the class label through weighted voting and still applies Euclidean distance without attribute weighting. As a result, all attributes are again treated as equally important regardless of their relevance, and classification performance suffers. This research proposes Coefficient of Variation Weighting k-Nearest Neighbor (CVWKNN), which integrates MinMax normalization with a weighted Euclidean distance. Seven public datasets from the UCI Machine Learning Repository were used. The Friedman test and Nemenyi post hoc test on accuracy showed that CVWKNN performed better than, and significantly differently from, the k-NN algorithm.
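
A minimal sketch of the idea described in the abstract is given below: each attribute is weighted by its coefficient of variation (standard deviation divided by mean) and the weights are used inside a weighted Euclidean distance after MinMax normalization. The exact way the weights are scaled, and the function names minmax_normalize, cv_weights, and cvwknn_predict, are assumptions for illustration and are not taken from the paper itself.

import numpy as np

def minmax_normalize(X, mins=None, maxs=None):
    # Scale each attribute to [0, 1]; reuse training mins/maxs for query data.
    X = np.asarray(X, dtype=float)
    if mins is None:
        mins, maxs = X.min(axis=0), X.max(axis=0)
    ranges = np.where(maxs - mins == 0, 1.0, maxs - mins)
    return (X - mins) / ranges, mins, maxs

def cv_weights(X):
    # Coefficient of variation (std / mean) per attribute, scaled to sum to 1
    # (the scaling is an assumption, not specified in the abstract).
    means, stds = X.mean(axis=0), X.std(axis=0)
    cv = np.divide(stds, means, out=np.zeros_like(stds), where=means != 0)
    total = cv.sum()
    return cv / total if total > 0 else np.full(X.shape[1], 1.0 / X.shape[1])

def cvwknn_predict(X_train, y_train, x_query, k=3):
    # Weighted Euclidean distance: d(x, q) = sqrt(sum_j w_j * (x_j - q_j)^2)
    Xn, mins, maxs = minmax_normalize(X_train)
    qn, _, _ = minmax_normalize(np.asarray(x_query, dtype=float).reshape(1, -1), mins, maxs)
    w = cv_weights(Xn)
    dists = np.sqrt((w * (Xn - qn) ** 2).sum(axis=1))
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(np.asarray(y_train)[nearest], return_counts=True)
    return labels[np.argmax(counts)]   # majority vote among the k nearest neighbors

In this sketch, attributes with little variation relative to their mean contribute less to the distance, which is one plausible way to address the equal-attribute-importance problem of plain Euclidean distance that the abstract describes.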