Abstract
This paper proposes two novel ensemble algorithms for training support vector machines, based on a constraint-projection technique and a selective ensemble strategy. First, projection matrices are determined from randomly selected must-link and cannot-link constraint sets, and the original training samples are transformed with these matrices into different representation spaces to train a group of base classifiers. Then, two selective ensemble techniques, genetic optimization and deviation-error minimization, are used to learn the best weighting vector for combining the base classifiers. Experiments on UCI datasets show that both proposed algorithms significantly improve the generalization performance of support vector machines and clearly outperform classical ensemble algorithms such as Bagging, Boosting, feature Bagging, and LoBag.
Supported by the National Natural Science Foundation of China (69732010) and Scientific Research of Southwestern University of Finance and Economics (QN0806).
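The two-stage pipeline described in the abstract can be made concrete with a short sketch. This is a minimal, hypothetical illustration rather than the authors' implementation: it assumes the projection matrix comes from a generalized eigenproblem on cannot-link versus must-link scatter matrices, and it replaces the paper's genetic-optimization / deviation-error weighting with simple validation-accuracy weights. All function names (constraint_projection, train_ensemble, predict_ensemble) are illustrative.

```python
# Hypothetical sketch of a constraint-projection SVM ensemble (not the
# authors' code). Assumptions: the projection is taken from a generalized
# eigenproblem on cannot-link vs. must-link scatter matrices, and the
# selective weights are plain validation accuracies instead of the
# genetic-optimization / deviation-error weights used in the paper.
import numpy as np
from scipy.linalg import eigh
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC


def pair_scatter(X, pairs):
    """Scatter matrix of difference vectors over the given index pairs."""
    D = np.array([X[i] - X[j] for i, j in pairs])
    return D.T @ D


def constraint_projection(X, y, n_pairs, n_dims, rng):
    """Draw random must-link / cannot-link pairs and build a projection."""
    cand = rng.integers(0, len(X), size=(6 * n_pairs, 2))
    must = [(i, j) for i, j in cand if i != j and y[i] == y[j]][:n_pairs]
    cannot = [(i, j) for i, j in cand if y[i] != y[j]][:n_pairs]
    S_ml = pair_scatter(X, must) + 1e-6 * np.eye(X.shape[1])  # regularized
    S_cl = pair_scatter(X, cannot)
    # Keep directions with large cannot-link scatter relative to must-link scatter.
    _, vecs = eigh(S_cl, S_ml)
    return vecs[:, -n_dims:]  # top n_dims generalized eigenvectors as columns


def train_ensemble(X, y, n_base=10, n_pairs=100, n_dims=None, seed=0):
    """Train n_base SVMs, each in its own constraint-projected space."""
    rng = np.random.default_rng(seed)
    n_dims = n_dims or max(1, X.shape[1] // 2)
    X_tr, X_val, y_tr, y_val = train_test_split(
        X, y, test_size=0.3, random_state=seed, stratify=y)
    members = []
    for _ in range(n_base):
        P = constraint_projection(X_tr, y_tr, n_pairs, n_dims, rng)
        clf = SVC(kernel="rbf", gamma="scale").fit(X_tr @ P, y_tr)
        members.append((P, clf, clf.score(X_val @ P, y_val)))
    # Placeholder for the selective step: weights from validation accuracy.
    weights = np.array([acc for _, _, acc in members])
    weights /= weights.sum()
    return members, weights


def predict_ensemble(members, weights, X):
    """Weighted vote of the base SVMs, each applied in its own space."""
    classes = members[0][1].classes_
    votes = np.zeros((len(X), len(classes)))
    for (P, clf, _), w in zip(members, weights):
        pred = clf.predict(X @ P)
        votes += w * (pred[:, None] == classes[None, :])
    return classes[votes.argmax(axis=1)]
```

In this sketch, diversity among the base SVMs comes solely from the randomly drawn constraint pairs; the paper's selective step would replace the accuracy-based weights with a vector learned by genetic search or by minimizing deviation errors. On a UCI-style dataset loaded into X, y, calling train_ensemble(X, y) and then predict_ensemble(members, weights, X_test) reproduces the two-stage structure: diverse projected SVMs, then one combination weight per member.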
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Wang, L., Yang, Y. (2009). Selective Ensemble Algorithms of Support Vector Machines Based on Constraint Projection. In: Yu, W., He, H., Zhang, N. (eds) Advances in Neural Networks – ISNN 2009. ISNN 2009. Lecture Notes in Computer Science, vol 5552. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-01510-6_33
DOI: https://doi.org/10.1007/978-3-642-01510-6_33
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-01509-0
Online ISBN: 978-3-642-01510-6
eBook Packages: Computer Science, Computer Science (R0)