Abstract
A new approach to the induction of multivariate decision trees is proposed. A linear decision function (hyperplane) is used at each non-terminal node of a binary tree to split the data. The search strategy is based on dipolar criterion functions and exploits the basis exchange algorithm as an optimization procedure. Feature selection is used to eliminate redundant and noisy features at each node. To avoid over-fitting, the tree is pruned back after the growing phase. The results of experiments on several real-life datasets are presented and compared with those obtained by state-of-the-art decision trees.
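The two core ingredients of the abstract can be illustrated with a short sketch: a linear (oblique) split at a tree node, and a simplified dipolar-style criterion that counts "mixed dipoles" (pairs of samples from different classes) separated by the hyperplane. This is a minimal illustration under assumed toy data, not the authors' actual criterion or the basis exchange optimizer; the function names and the weight vector `w` and offset `b` are hypothetical.

```python
import numpy as np

def split_mask(X, w, b):
    """Side of the hyperplane w.x + b = 0 for each sample (True = positive side)."""
    return X @ w + b > 0

def mixed_dipoles_cut(X, y, w, b):
    """Count mixed dipoles (pairs of samples with different class labels)
    that the hyperplane separates -- a simplified stand-in for a dipolar
    criterion; a good split cuts many mixed dipoles."""
    side = split_mask(X, w, b)
    cut = 0
    for i in range(len(y)):
        for j in range(i + 1, len(y)):
            if y[i] != y[j] and side[i] != side[j]:
                cut += 1
    return cut

# Toy data: two 2-D clusters, one per class (hypothetical example)
X = np.array([[0.0, 0.0], [0.1, 0.2], [2.0, 2.1], [2.2, 1.9]])
y = np.array([0, 0, 1, 1])
w = np.array([1.0, 1.0])   # assumed hyperplane normal
b = -2.0                   # assumed offset

print(mixed_dipoles_cut(X, y, w, b))  # this hyperplane cuts all 4 mixed dipoles
```

In the full method, the hyperplane parameters at each node would be chosen to optimize such a criterion (via the basis exchange algorithm) rather than fixed by hand, and the data on each side would be passed to the child nodes recursively.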
© 2000 Springer-Verlag Berlin Heidelberg
Cite this paper
Bobrowski, L., Kretowski, M. (2000). Induction of Multivariate Decision Trees by Using Dipolar Criteria. In: Zighed, D.A., Komorowski, J., Żytkow, J. (eds) Principles of Data Mining and Knowledge Discovery. PKDD 2000. Lecture Notes in Computer Science(), vol 1910. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45372-5_33
Print ISBN: 978-3-540-41066-9
Online ISBN: 978-3-540-45372-7