RANDOM FOREST ANALYSIS ON DIABETES COMPLICATION DATA

Punnee Sittidech, Nongyao Nai-arun


References

[1] International Diabetes Federation. (2013). Retrieved March 12, 2013, Available: http://www.idf.org/diabetesatlas/5e/the-global-burden
[2] World Health Organization. (2014). Retrieved January 5, 2014, Available: http://www.who.int/diabetes/action_online/basics/en/index3.htm
[3] J. Han, M. Kamber, and J. Pei, Data Mining: Concepts and Techniques, 3rd ed. (USA: Morgan Kaufmann Publishers, 2012).
[4] P.-N. Tan, M. Steinbach, and V. Kumar, Introduction to Data Mining (Addison Wesley, 2006).
[5] L. Breiman, Bagging Predictors, Machine Learning, 24, 1996, 123-140.
[6] T. G. Dietterich, An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization, Machine Learning, 40, 2000, 139-157.
[7] L. Breiman, Random Forests, Machine Learning, 45, 2001, 5-32. DOI 10.1023/A:1010933404324.
[8] R. A. Caruana and D. Freitag, How Useful is Relevance? Technical report, Fall '94 AAAI Symposium on Relevance, New Orleans, 1994.
[9] K. Selvakuberan, M. Indradevi, and R. Rajaram, Combined Feature Selection and Classification: A Novel Approach for the Categorization of Web Pages, Journal of Information and Computing Science, 3(2), 2008, 083-089.
[10] H. C. Yang and C. H. Lee, A Text Mining Approach on Automatic Generation of Web Directories and Hierarchies, Proc. IEEE/WIC International Conference on Web Intelligence (WI'03), 2003.
[11] Y. Yang and J. O. Pedersen, A Comparative Study of Feature Selection in Text Categorization, Proc. 14th International Conference on Machine Learning (ICML'97), 1997, 412-420.
[12] J. R. Quinlan, Induction of Decision Trees (Readings in Machine Learning, 1986).
[13] A. L. Symeonidis and P. A. Mitkas, Agent Intelligence Through Data Mining (USA: Springer Science and Business Media, 2005).
[14] S. B. Kotsiantis, Supervised Machine Learning: A Review of Classification Techniques, Informatica, 31, 2007, 249-268.
[15] C. S. Sang, Practical Applications of Data Mining (USA: Jones & Bartlett Publishers, 2012).
[16] J. Ali, R. Khan, N. Ahmad, and I. Maqsood, Random Forests and Decision Trees, IJCSI International Journal of Computer Science Issues, 9(5), No. 3, 2012.
[17] I. H. Witten and E. Frank, Data Mining: Practical Machine Learning Tools and Techniques, 2nd ed. (USA: Morgan Kaufmann Publishers, 2005).
[18] K. Machová, F. Barčák, and P. Bednár, A Bagging Method Using Decision Trees in the Role of Base Classifiers, Acta Polytechnica Hungarica, 3(2), 2006.
[19] C. X. Ling and V. S. Sheng, Cost-Sensitive Learning and the Class Imbalance Problem, Encyclopedia of Machine Learning, C. Sammut (Ed.), Springer, Canada, 2008, 319.
[20] G. Biau, Analysis of a Random Forests Model, Journal of Machine Learning Research, 13, 2012, 1063-1095.
[21] P. Geurts et al., Proteomic mass spectra classification using decision tree based ensemble methods, Bioinformatics, 21(15), 2005, 3138-3145.
[22] S. Chakrabarti, E. Cox, E. Frank, R. H. Guting, J. Han, X. Jiang, M. Kamber, S. Lightstone, T. P. Nadeau, R. E. Neapolitan, D. Pyle, M. Refaat, M. Schneider, T. J. Teorey, and I. H. Witten, Data Mining: Know It All (USA: Morgan Kaufmann Publishers, 2008).
[23] A. G. K. Janecek, W. N. Gansterer, M. A. Demel, and G. F. Ecker, On the Relationship Between Feature Selection and Classification Accuracy, JMLR Workshop and Conference, 4, 2008, 90-105.
[24] L. Ladha and T. Deepa, Feature Selection Methods and Algorithms, 2011.
[25] B. Krishnapuram et al., A Bayesian Approach to Joint Feature Selection and Classifier Design, IEEE Transactions on Pattern Analysis and Machine Intelligence, 26(9), 2004, 1105-1111.
[26] Y. Huang, P. McCullagh, N. Black, and R. Harper, Feature Selection and Classification Model Construction on Type 2 Diabetic Patients' Data, Artificial Intelligence in Medicine, 41, 2007, 251-262.
[27] G. K. Asha, A. S. Manjunath, and M. A. Jayaram, Comparative Study of Attribute Selection Using Gain Ratio and Correlation Based Feature Selection, International Journal of Information Technology and Knowledge Management, 2(2), 2010, 271-277.
