k-Nearest Neighbor (k-NN) is ranked among the top 10 classification algorithms in data mining. k-NN is simple and easy to apply, but its classification results are strongly influenced by the scale of the input data. Euclidean distance treats all attributes as equally important, which does not match the actual relevance of each attribute and therefore degrades classification results: some attributes are more or less relevant, or in fact irrelevant, to the classification outcome. To overcome this disadvantage of k-NN, Zolghadri, Parvinnia, and John proposed Weighted Distance Nearest Neighbor (WDNN), which outperforms k-NN; however, its performance drops when k > 1. Gou proposed Dual Distance Weighted Voting k-Nearest Neighbor (DWKNN), which also outperforms k-NN; however, DWKNN focuses on determining the class label through weighted voting and still applies plain Euclidean distance without attribute weighting. As a result, all attributes are again treated as equally important, which conflicts with their actual relevance and can degrade classification results. This research proposes Coefficient of Variation Weighting k-Nearest Neighbor (CVWKNN), which integrates MinMax normalization with a weighted Euclidean distance. Seven public datasets from the UCI Machine Learning Repository were used. The results of the Friedman test and Nemenyi post hoc test on accuracy showed that CVWKNN performed better than, and significantly differently from, the standard k-NN algorithm.
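The sketch below illustrates how such a pipeline might be assembled: MinMax normalization, per-attribute weights based on the coefficient of variation (standard deviation divided by the mean), and a weighted Euclidean distance inside a majority-vote k-NN classifier. It is a minimal illustration, not the paper's implementation; the function names, the exact weight formula, and the choice to compute the CV on the raw (pre-normalization) attributes are all assumptions.

```python
import numpy as np
from collections import Counter

def minmax_normalize(X, X_ref=None):
    """Scale each attribute to [0, 1] using min/max taken from reference data."""
    X_ref = X if X_ref is None else X_ref
    lo, hi = X_ref.min(axis=0), X_ref.max(axis=0)
    return (X - lo) / np.where(hi - lo == 0, 1, hi - lo)

def cv_weights(X):
    """Per-attribute coefficient of variation (std / mean), normalized to sum to 1.
    Assumption: the paper's exact weighting formula may differ."""
    means = X.mean(axis=0)
    cv = X.std(axis=0) / np.where(means == 0, 1, means)
    return cv / cv.sum()

def predict_cvwknn(X_train, y_train, x, w, k=3):
    """Classify one sample by majority vote among its k nearest neighbors
    under a weighted Euclidean distance."""
    d = np.sqrt((((X_train - x) ** 2) * w).sum(axis=1))
    nearest = np.argsort(d)[:k]
    return Counter(y_train[nearest]).most_common(1)[0][0]

if __name__ == "__main__":
    # Hypothetical usage on synthetic data, standing in for a UCI dataset.
    rng = np.random.default_rng(0)
    X_train = rng.random((20, 3))
    y_train = rng.integers(0, 2, 20)
    Xn_train = minmax_normalize(X_train)
    w = cv_weights(X_train)  # assumption: CV computed on raw training data
    x_new = minmax_normalize(rng.random((1, 3)), X_ref=X_train)[0]
    print(predict_cvwknn(Xn_train, y_train, x_new, w, k=3))
```

Under this reading, attributes with little variation contribute little to the distance, while highly varying attributes dominate, which is one plausible way to approximate attribute relevance without supervised feature selection.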