L. Jiang, Z. Cai, and D. Wang


Keywords: Naive Bayes, instance weighting, combined learning, class probability estimation


Naive Bayes (NB) is one of the most widely used algorithms for classification. However, its conditional independence assumption harms its performance to some extent, and many algorithms have therefore been proposed to improve its classification accuracy. In this paper, we present two further improved algorithms: instance weighted naive Bayes (IWNB) and combined neighbourhood naive Bayes (CNNB). In IWNB, each training instance is first weighted according to its similarity to the mode of the training instances, and then an NB classifier is built on the weighted training instances. In CNNB, multiple NB classifiers are first built on multiple neighbourhoods of a test instance with different radius values, and then their class probability estimates are averaged to estimate the class probability of the test instance. We experimentally tested IWNB and CNNB on all 36 University of California, Irvine (UCI) data sets selected by Weka, and compared them with NB. The experimental results show that both IWNB and CNNB significantly outperform NB.
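The instance-weighting idea described above can be sketched in a few lines. The sketch below is a minimal illustration, not the paper's implementation: it assumes each instance's weight is its smoothed fraction of attribute values matching the per-attribute mode of the training set, and uses Laplace-smoothed weighted frequency counts for the NB estimates.

```python
from collections import Counter
import numpy as np

def iwnb_fit(X, y, alpha=1.0):
    """Fit an instance-weighted naive Bayes model (IWNB sketch).

    Assumption: the weight of an instance is the (smoothed) fraction of
    its attribute values matching the per-attribute mode of the training
    set; the paper's exact similarity formula may differ.
    """
    n, m = X.shape
    # per-attribute mode of the training instances
    mode = np.array([Counter(X[:, j]).most_common(1)[0][0] for j in range(m)])
    # similarity to the mode, smoothed to avoid zero weights
    weights = np.array([((row == mode).sum() + 1) / (m + 1) for row in X])
    classes = np.unique(y)
    priors, cond = {}, {}
    for c in classes:
        w_c = weights[y == c]
        # weighted class prior with Laplace smoothing
        priors[c] = (w_c.sum() + alpha) / (weights.sum() + alpha * len(classes))
        cond[c] = []
        for j in range(m):
            vals = np.unique(X[:, j])
            col = X[y == c, j]
            # weighted conditional probabilities with Laplace smoothing
            cond[c].append({v: (w_c[col == v].sum() + alpha)
                               / (w_c.sum() + alpha * len(vals))
                            for v in vals})
    return classes, priors, cond

def iwnb_predict(model, x):
    """Predict the class maximizing prior times conditional probabilities."""
    classes, priors, cond = model
    scores = {c: priors[c] * np.prod([cond[c][j].get(v, 1e-9)
                                      for j, v in enumerate(x)])
              for c in classes}
    return max(scores, key=scores.get)

# toy categorical data for illustration
X = np.array([["sunny", "hot"], ["sunny", "mild"],
              ["rain", "mild"], ["rain", "hot"]])
y = np.array(["no", "no", "yes", "yes"])
model = iwnb_fit(X, y)
print(iwnb_predict(model, ["sunny", "hot"]))  # -> no
```

CNNB would follow the same estimation step but, instead of one weighted model, build several unweighted NB models on neighbourhoods of the test instance with different radii and average their class probability estimates.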
