Abstract
When predictive modeling requires comprehensible models, most data miners use specialized techniques that produce rule sets or decision trees. This study, however, shows that genetically evolved decision trees may very well outperform these more specialized techniques. The proposed approach evolves a number of decision trees and then uses one of several suggested selection strategies to pick one specific tree from that pool. The inherent inconsistency (i.e., stochasticity) of evolutionary search makes it possible to evolve each tree on all available data and still obtain somewhat different models. The main idea is to use these quite accurate yet slightly diverse trees to form an imaginary ensemble, which then serves as a guide when selecting one specific tree. Simply put, the tree that classifies the largest number of instances identically to the ensemble is chosen. In experiments on 25 UCI data sets, two of the selection strategies obtained significantly higher accuracy than the standard rule inducer J48.
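The core selection criterion can be illustrated in a few lines. The following is a minimal sketch, not the authors' implementation: it assumes a pool of evolved trees exposing a scikit-learn-style predict method that returns non-negative integer class labels, forms the imaginary ensemble by per-instance majority vote, and returns the tree with the highest agreement. All identifiers (select_tree, trees, X) are hypothetical, and the paper's several selection strategies (which differ, e.g., in which data agreement is measured on) are reduced here to the single agreement criterion.

```python
import numpy as np

def select_tree(trees, X):
    """Pick the tree that agrees most often with the majority vote
    of the whole pool of evolved trees (the 'imaginary ensemble')."""
    # Predictions of every tree on the selection data:
    # shape (n_trees, n_instances), integer class labels assumed.
    preds = np.array([tree.predict(X) for tree in trees])

    # The imaginary ensemble: per-instance majority vote over the pool.
    ensemble_vote = np.array([np.bincount(col).argmax() for col in preds.T])

    # Fraction of instances each tree classifies identically to the ensemble.
    agreement = (preds == ensemble_vote).mean(axis=1)

    return trees[int(np.argmax(agreement))]
```

Which data X refers to (training data, or unlabeled production data) is exactly what distinguishes the selection strategies the abstract alludes to; the sketch leaves that choice to the caller.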
Cite this paper
Johansson, U., König, R., Löfström, T., Niklasson, L. (2010). Using Imaginary Ensembles to Select GP Classifiers. In: Esparcia-Alcázar, A.I., Ekárt, A., Silva, S., Dignum, S., Uyar, A.Ş. (eds) Genetic Programming. EuroGP 2010. Lecture Notes in Computer Science, vol 6021. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-12148-7_24