Chaoqun Li and Hongwei Li


  [1] J.R. Quinlan, Learning with continuous classes, Proc. 5th Australian Joint Conf. Artificial Intelligence, Hobart, Australia, 1992, 343–348.
  [2] Y. Wang and I.H. Witten, Induction of model trees for predicting continuous classes, Proc. Poster Papers of the European Conference on Machine Learning, 1997, 128–137.
  [3] R. Setiono and W.K. Leow, Pruned neural networks for regression, Proc. 6th Biennial Pacific Rim Int. Conf. Artificial Intelligence, PRICAI 2000, LNAI 1886, Springer Press, 500–509.
  [4] S.K. Shevade, S.S. Keerthi, C. Bhattacharyya, and K.R.K. Murthy, Improvements to SMO algorithm for SVM regression, Technical Report CD-99-16, Control Division, Department of Mechanical and Production Engineering, National University of Singapore, Singapore, 1999.
  [5] C. Li, Instance weighted linear regression, Journal of Computational Information Systems, 4 (6), 2008, 2395–2402.
  [6] T.M. Mitchell, Instance-based learning, Chapter 8, Machine Learning (New York: McGraw-Hill, 1997).
  [7] C. Li and L. Jiang, Using locally weighted learning to improve SMOreg for regression, Proc. 9th Biennial Pacific Rim Int. Conf. Artificial Intelligence, PRICAI 2006, LNAI 4099, Springer Press, 375–384.
  [8] L. Jiang, H. Zhang, D. Wang, and Z. Cai, Learning locally weighted C4.4 for class probability estimation, Proc. 10th Int. Conf. Discovery Science, DS 2007, LNAI 4755, Springer Press, 104–115.
  [9] Q. Wang, L. Zhang, M.M. Chi, and J.K. Guo, MTForest: ensemble decision trees based on multi-task learning, Proc. 18th European Conf. Artificial Intelligence, 2008, 122–126.
  [10] L. Breiman, J.H. Friedman, R.A. Olshen, and C.J. Stone, Classification and Regression Trees (Wadsworth, Belmont, CA, 1984).
  [11] N. Landwehr, M. Hall, and E. Frank, Logistic model trees, Machine Learning, 59 (1–2), 2005, 161–205.
  [12] P.N. Tan, M. Steinbach, and V. Kumar, Classification: alternative techniques, Chapter 5, Introduction to Data Mining (Boston: Pearson Education, Inc., 2006).
  [13] L. Breiman, Bagging predictors, Machine Learning, 24 (2), 1996, 123–140.
  [14] Y. Freund and R.E. Schapire, Experiments with a new boosting algorithm, in L. Saitta (Ed.), Proc. Thirteenth Int. Conf. Machine Learning (San Francisco: Morgan Kaufmann, Bari, Italy), 1996, 148–156.
  [15] T.G. Dietterich, An experimental comparison of three methods for constructing ensembles of decision trees: bagging, boosting, and randomization, Machine Learning, 40 (2), 2000, 139–157.
  [16] T.K. Ho, The random subspace method for constructing decision forests, IEEE Transactions on Pattern Analysis and Machine Intelligence, 20 (8), 1998, 832–844.
  [17] L. Breiman, Random forests, Machine Learning, 45, 2001, 5–32.
  [18] I.H. Witten and E. Frank, Data Mining: Practical Machine Learning Tools and Techniques, 2nd ed. (San Francisco: Morgan Kaufmann, 2005).
  [19] C. Nadeau and Y. Bengio, Inference for the generalization error, Machine Learning, 52 (3), 2003, 239–281.
  [20] L. Jiang, D. Wang, Z. Cai, and X. Yan, Survey of improving naive Bayes for classification, Proc. 3rd Int. Conf. Advanced Data Mining and Applications, ADMA 2007, LNAI 4632, Springer Press, 134–145.
  [21] L. Jiang, C. Li, J. Wu, and J. Zhu, A combined classification algorithm based on C4.5 and NB, Proc. 3rd Int. Symp. Intelligent Computation and its Applications, ISICA 2008, LNCS 5370, Springer Press, 350–359.
  [22] L. Jiang, D. Wang, Z. Cai, S. Jiang, and X. Yan, Scaling up the accuracy of K-nearest-neighbor classifiers: a naive-Bayes hybrid, International Journal of Computers and Applications, 31 (1), 2009, 36–43.
