SCALING UP THE ACCURACY OF K-NEAREST-NEIGHBOUR CLASSIFIERS: A NAIVE-BAYES HYBRID

L. Jiang, D. Wang, Z. Cai, S. Jiang, and X. Yan

Keywords

k-nearest-neighbour, naive Bayes, distance function, neighbourhood size, class probability estimation.

Abstract

k-nearest-neighbour (KNN) is widely used as an effective classification model. In this paper, we summarize three main shortcomings of KNN and identify three categories of approaches for overcoming them. After reviewing some algorithms in each category, we present a hybrid algorithm called dynamic k-nearest-neighbour naive Bayes with attribute weighting (DKNAW for short), which combines three of these improvements. We conduct an extensive empirical comparison of the related algorithms in four groups, using all 36 UCI data sets selected by Weka. In the first three groups, we compare algorithms within each category; in the fourth group, we compare our hybrid approach to each single approach. Finally, we discuss some directions for future work on KNN.
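To make the hybrid idea concrete, the following is a minimal sketch of a KNN/naive-Bayes combination in the spirit of DKNAW: select the k nearest neighbours under an attribute-weighted distance, then estimate class probabilities with a Gaussian naive Bayes fitted only on that neighbourhood. The function name, the uniform default weights, and the Gaussian likelihood are illustrative assumptions; the abstract does not specify the paper's exact weighting scheme or its dynamic-k selection.

```python
import numpy as np

def dknaw_predict(X, y, x_query, k=5, weights=None):
    """Illustrative KNN + naive Bayes hybrid (not the paper's exact method):
    1) rank training points by attribute-weighted Euclidean distance,
    2) keep the k nearest neighbours,
    3) classify with a Gaussian naive Bayes fitted on that neighbourhood."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    if weights is None:
        weights = np.ones(X.shape[1])              # uniform attribute weights (placeholder)
    # attribute-weighted squared distances to the query point
    d = ((X - x_query) ** 2 * weights).sum(axis=1)
    nn = np.argsort(d)[:k]                         # indices of the k nearest neighbours
    Xn, yn = X[nn], y[nn]
    classes = np.unique(y)
    log_post = []
    for c in classes:
        Xc = Xn[yn == c]
        # Laplace-smoothed class prior estimated inside the neighbourhood
        prior = (len(Xc) + 1) / (k + len(classes))
        if len(Xc) == 0:
            log_post.append(np.log(prior))
            continue
        # per-attribute Gaussian likelihoods (variance floored for stability)
        mu, var = Xc.mean(axis=0), Xc.var(axis=0) + 1e-6
        ll = -0.5 * (np.log(2 * np.pi * var) + (x_query - mu) ** 2 / var)
        log_post.append(np.log(prior) + ll.sum())
    return classes[int(np.argmax(log_post))]

# toy usage: two well-separated clusters
X = [[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]]
y = [0, 0, 0, 1, 1, 1]
print(dknaw_predict(X, y, np.array([0.2, 0.3]), k=3))  # → 0
```

Estimating the naive-Bayes parameters only on the k neighbours is what makes the model local: the attribute-independence assumption is far milder inside a small neighbourhood than over the whole training set.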
