Abstract
Many studies on ensemble learning, which combines multiple classifiers, have shown that it is an effective technique for improving the accuracy and stability of a single classifier. In this paper, we propose a novel discriminative classifier fusion method that uses the local classification results of each classifier among the nearest neighbors of a query sample to build a local classifier ensemble. Through this dynamic selection process, the more discriminative classifiers are weighted heavily, yielding a locally discriminative ensemble. Experimental results on several UCI datasets show that the proposed method achieves the best classification performance compared with the individual classifiers, majority voting, and the AdaBoost algorithm.
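The abstract does not specify the exact weighting rule, so the following is only a minimal sketch of the general idea: for each test sample, estimate each base classifier's accuracy on the k nearest training neighbors and use those local accuracies as vote weights. The neighborhood size (k = 7), the pool of base classifiers, and the accuracy-based weighting are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# A UCI-style dataset, split into training and test sets.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Pool of heterogeneous base classifiers (an illustrative choice).
pool = [DecisionTreeClassifier(random_state=0), GaussianNB(), SVC()]
for clf in pool:
    clf.fit(X_train, y_train)

# Precompute each classifier's predictions on the training set,
# so local accuracies can be read off per neighborhood.
train_preds = np.array([clf.predict(X_train) for clf in pool])

# Index the training set for nearest-neighbor lookup (k = 7 assumed).
nn = NearestNeighbors(n_neighbors=7).fit(X_train)

def predict_local_ensemble(x):
    """Weight each classifier by its accuracy on the k nearest training
    neighbors of x, then take a weighted vote (hypothetical scheme)."""
    _, idx = nn.kneighbors(x.reshape(1, -1))
    idx = idx[0]
    # Local accuracy of each classifier on the neighborhood of x.
    weights = np.array([(train_preds[i, idx] == y_train[idx]).mean()
                        for i in range(len(pool))])
    # Weighted vote over the classifiers' predictions for x.
    votes = {}
    for w, clf in zip(weights, pool):
        label = clf.predict(x.reshape(1, -1))[0]
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)

y_pred = np.array([predict_local_ensemble(x) for x in X_test])
print("local-ensemble accuracy:", (y_pred == y_test).mean())
```

Because the weights are recomputed per test sample, a classifier that is weak globally can still dominate the vote in regions of the feature space where it happens to be locally discriminative, which is the intuition the abstract appeals to.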
Acknowledgments
This work was funded in part by the National Natural Science Foundation of China (Nos. 61572240 and 61502208), the Natural Science Foundation of Jiangsu Province of China (No. BK20150522), and the Open Project Program of the National Laboratory of Pattern Recognition (NLPR) (No. 201600005).
Copyright information
© 2016 Springer International Publishing AG
About this paper
Cite this paper
Shen, X.-J., et al. (2016). Building Locally Discriminative Classifier Ensemble Through Classifier Fusion Among Nearest Neighbors. In: Chen, E., Gong, Y., Tie, Y. (eds.) Advances in Multimedia Information Processing - PCM 2016. Lecture Notes in Computer Science, vol. 9916. Springer, Cham. https://doi.org/10.1007/978-3-319-48890-5_21
DOI: https://doi.org/10.1007/978-3-319-48890-5_21
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-48889-9
Online ISBN: 978-3-319-48890-5
eBook Packages: Computer Science (R0)