
Building Locally Discriminative Classifier Ensemble Through Classifier Fusion Among Nearest Neighbors

  • Conference paper
  • In: Advances in Multimedia Information Processing - PCM 2016 (PCM 2016)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 9916)


Abstract

Many studies on ensemble learning, which combines multiple classifiers, have shown that it is an effective technique for improving the accuracy and stability of a single classifier. In this paper, we propose a novel discriminative classifier fusion method that uses the local classification results of classifiers among the nearest neighbors of a query sample to build a local classifier ensemble. Through this dynamic selection process, discriminative classifiers are weighted heavily to form a locally discriminative ensemble. Experimental results on several UCI datasets show that the proposed method achieves the best classification performance compared with individual classifiers, majority voting, and the AdaBoost algorithm.
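
The full chapter is paywalled here, so the following is only a minimal sketch of the general idea the abstract describes: for each query sample, weight every base classifier by how well it classifies the query's nearest training neighbors, then fuse the weighted votes. It is not the authors' exact algorithm; the function name `local_weighted_fusion`, the accuracy-as-weight rule, and the scikit-learn setup are illustrative assumptions.

```python
# Sketch of locally weighted classifier fusion (NOT the authors' exact
# method): each pre-fitted classifier votes for its predicted label,
# weighted by its accuracy on the query's k nearest training neighbors.
# Inputs are assumed to be NumPy arrays.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def local_weighted_fusion(classifiers, X_train, y_train, X_test, k=10):
    """Fuse pre-fitted `classifiers` per test sample, weighting each one
    by its accuracy among the sample's k nearest training neighbors."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    # Cache every classifier's predictions once, up front.
    train_preds = [clf.predict(X_train) for clf in classifiers]
    test_preds = [clf.predict(X_test) for clf in classifiers]
    labels = np.unique(y_train)
    y_out = np.empty(len(X_test), dtype=labels.dtype)

    neighbor_idx = nn.kneighbors(X_test, return_distance=False)
    for i, neighbors in enumerate(neighbor_idx):
        votes = dict.fromkeys(labels.tolist(), 0.0)
        for tr_pred, te_pred in zip(train_preds, test_preds):
            # Local accuracy: fraction of the k neighbors this
            # classifier labels correctly; used as its vote weight.
            weight = np.mean(tr_pred[neighbors] == y_train[neighbors])
            votes[te_pred[i].item()] += weight
        y_out[i] = max(votes, key=votes.get)
    return y_out
```

With pre-fitted scikit-learn base learners, a call might look like `local_weighted_fusion([tree, svm, nb], X_tr, y_tr, X_te, k=10)`. Because the weights are recomputed from each query's own neighborhood, the ensemble is effectively selected dynamically per sample, which is the property the abstract emphasizes.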



Acknowledgments

This work was funded in part by the National Natural Science Foundation of China (Nos. 61572240 and 61502208), the Natural Science Foundation of Jiangsu Province of China (No. BK20150522), and the Open Project Program of the National Laboratory of Pattern Recognition (NLPR) (No. 201600005).

Author information


Corresponding author

Correspondence to Qian Zhu.



Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Shen, XJ. et al. (2016). Building Locally Discriminative Classifier Ensemble Through Classifier Fusion Among Nearest Neighbors. In: Chen, E., Gong, Y., Tie, Y. (eds) Advances in Multimedia Information Processing - PCM 2016. PCM 2016. Lecture Notes in Computer Science, vol 9916. Springer, Cham. https://doi.org/10.1007/978-3-319-48890-5_21

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-48890-5_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-48889-9

  • Online ISBN: 978-3-319-48890-5

  • eBook Packages: Computer Science, Computer Science (R0)
