Abstract
Bagging, along with other classifier ensembles, has enabled performance improvements in many pattern recognition problems over the last decade. A careful analysis of previous work points out, however, that the most significant advances of bagged neural networks are achieved on multiclass problems, whereas binary classification problems seldom benefit from classifier combination. Focusing on binary classification applications, this paper evaluates the standard bagging approach and explores a novel variant, local bagging, which keeps the standard individual classifier generation but attempts to improve the decision combination stage by (a) dynamically selecting a set of individual classifiers and (b) subsequently weighting them by their local accuracy. Experimental results on standard benchmark data sets, with Neural Networks, SVMs, Naive Bayes, C4.5 Decision Trees and Decision Stumps as base classifiers, show that local bagging yields significant improvements in these practical applications and proves more stable than AdaBoost.
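The abstract does not give the exact selection and weighting rules, but the decision combination stage it describes can be sketched roughly as follows. This is a minimal illustration, not the paper's method: the neighbourhood size `k`, the 0.5 competence threshold, the Euclidean neighbourhood, and all function names are assumptions made here for concreteness.

```python
# Hypothetical sketch of a local-bagging decision stage: for each test
# point, estimate every bagged classifier's accuracy on its k nearest
# validation neighbours, keep only locally competent classifiers, and
# weight their votes by that local accuracy.
import numpy as np

def local_bagging_predict(x, classifiers, X_val, y_val, k=10):
    """Predict the label of x with a locally selected, locally weighted vote."""
    # Find the k validation points nearest to x (Euclidean distance assumed).
    dists = np.linalg.norm(X_val - x, axis=1)
    neighbours = np.argsort(dists)[:k]
    votes = {}
    for clf in classifiers:
        # Local accuracy: fraction of the k neighbours this classifier labels correctly.
        acc = np.mean([clf(X_val[i]) == y_val[i] for i in neighbours])
        if acc > 0.5:  # dynamic selection: drop locally incompetent classifiers
            pred = clf(x)
            votes[pred] = votes.get(pred, 0.0) + acc  # weight the vote by local accuracy
    if not votes:  # fallback: plain majority vote if no classifier passes the threshold
        preds = [clf(x) for clf in classifiers]
        return max(set(preds), key=preds.count)
    return max(votes, key=votes.get)
```

Each base classifier here is any callable mapping a sample to a class label, so bagged networks, trees or stumps plug in uniformly; only the combination rule changes relative to standard bagging.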
This work has been partially supported by the Spanish MEC project DPI2006-02550.
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Alaiz-Rodríguez, R. (2008). Local Decision Bagging of Binary Neural Classifiers. In: Bergler, S. (eds) Advances in Artificial Intelligence. Canadian AI 2008. Lecture Notes in Computer Science(), vol 5032. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-68825-9_1
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-68821-1
Online ISBN: 978-3-540-68825-9