CORRELATION WEIGHTED HETEROGENEOUS EUCLIDEAN-OVERLAP METRIC

Chaoqun Li and Hongwei Li

References

  [1] E. Frank, M. Hall, and B. Pfahringer, Locally weighted naive Bayes, Proc. Conf. on Uncertainty in Artificial Intelligence, Morgan Kaufmann, 2003, 249–256.
  [2] B. Wang and H. Zhang, Probability based metrics for locally weighted naive Bayes, Proc. 20th Canadian Conf. on Artificial Intelligence, 2007, 180–191.
  [3] R.D. Short and K. Fukunaga, The optimal distance measure for nearest neighbour classification, IEEE Transactions on Information Theory, 27(5), 1981, 622–627.
  [4] J.P. Myles and D.J. Hand, The multi-class metric problem in nearest neighbour discrimination rules, Pattern Recognition, 23(11), 1990, 1291–1297.
  [5] E. Blanzieri and F. Ricci, Probability based metrics for nearest neighbor classification and case-based reasoning, Proc. 3rd International Conf. on Case-Based Reasoning and Development, Lecture Notes in Computer Science, 1650, 1999, 14–28.
  [6] C. Stanfill and D. Waltz, Toward memory-based reasoning, Communications of the ACM, 29, 1986, 1213–1228.
  [7] S. Cost and S. Salzberg, A weighted nearest neighbor algorithm for learning with symbolic features, Machine Learning, 10(1), 1993, 57–78.
  [8] R. John, S. Kasif, S. Salzberg, and D.W. Aha, Towards a better understanding of memory-based and Bayesian classifiers, Proc. 11th International Conf. on Machine Learning, New Brunswick, NJ, Morgan Kaufmann, 1994, 242–250.
  [9] D.R. Wilson and T.R. Martinez, Improved heterogeneous distance functions, Journal of Artificial Intelligence Research, 6(1), 1997, 1–34.
  [10] P.N. Tan, M. Steinbach, and V. Kumar, Introduction to data mining, 1st ed. (Boston, MA: Pearson Education, 2006).
  [11] J.G. Cleary and L.E. Trigg, K*: An instance-based learner using an entropic distance measure, Proc. 12th International Machine Learning Conference, Tahoe City, CA, Morgan Kaufmann, 1995, 108–114.
  [12] H. Wang, Nearest neighbors by neighborhood counting, IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(6), 2006, 942–953.
  [13] D.W. Aha, Tolerating noisy, irrelevant, and novel attributes in instance-based learning algorithms, International Journal of Man-Machine Studies, 36(2), 1992, 267–287.
  [14] D. Wettschereck and D.W. Aha, Weighting features, Proc. 1st International Conf. on Case-Based Reasoning Research and Development, Lecture Notes in Computer Science, Springer-Verlag, London, UK, 1010, 1995, 347–358.
  [15] R. Kohavi, P. Langley, and Y. Yun, The utility of feature weighting in nearest-neighbor algorithms, Poster Papers: 9th European Conf. on Machine Learning, Prague, Czech Republic, 1997. Unpublished.
  [16] G. John, R. Kohavi, and K. Pfleger, Irrelevant features and the subset selection problem, Proc. 11th International Conf. on Machine Learning, Morgan Kaufmann, 1994, 121–129.
  [17] M.A. Hall, Correlation-based feature selection for discrete and numeric class machine learning, Proc. 17th International Conf. on Machine Learning, 2000, 359–366.
  [18] L. Yu and H. Liu, Efficient feature selection via analysis of relevance and redundancy, Journal of Machine Learning Research, 5, 2004, 1205–1224.
  [19] H. Peng, F. Long, and C. Ding, Feature selection based on mutual information: Criteria of max-dependency, max-relevance, and min-redundancy, IEEE Transactions on Pattern Analysis and Machine Intelligence, 27(8), 2005, 1226–1238.
  [20] D.R. Wilson, Advances in instance-based learning algorithms, doctoral dissertation, Brigham Young University, Provo, UT, 1997.
  [21] N. Friedman, D. Geiger, and M. Goldszmidt, Bayesian network classifiers, Machine Learning, 29, 1997, 131–161.
  [22] E.H. Han, G. Karypis, and V. Kumar, Text categorization using weight adjusted k-nearest neighbor classification, Technical report, Department of Computer Science, University of Minnesota, 1999.
  [23] L. Jiang, D. Wang, Z. Cai, S. Jiang, and X. Yan, Scaling up the accuracy of k-nearest-neighbor classifiers: A naive-Bayes hybrid, International Journal of Computers and Applications, 31(1), 2009, 36–43.
  [24] C. Li, L. Jiang, and J. Wu, Distance and attribute weighted k-nearest-neighbor and its application in reservoir porosity prediction, Journal of Information and Computational Science, 6(2), 2009, 845–851.
  [25] W.H. Press, S.A. Teukolsky, W.T. Vetterling, and B.P. Flannery, Numerical recipes in C, 2nd ed. (Cambridge: Cambridge University Press, 1992).
  [26] I.H. Witten and E. Frank, Data mining: Practical machine learning tools and techniques, 2nd ed. (San Francisco, CA: Morgan Kaufmann, 2005).
  [27] C. Nadeau and Y. Bengio, Inference for the generalization error, Machine Learning, 52(3), 2003, 239–281.