
A New Multi-classifier Ensemble Algorithm Based on D-S Evidence Theory

Published in Neural Processing Letters

Abstract

Classifier ensembles are an important topic in ensemble learning: several base classifiers are combined so that the ensemble performs better than any individual member. However, the combination strategy itself makes it difficult to integrate multiple classifiers effectively. To address this issue, this paper proposes a multi-classifier ensemble algorithm based on Dempster-Shafer (D-S) evidence theory. The proposed algorithm rests on two main ideas. (a) Four probabilistic classifiers are built to provide redundant and complementary decision information, which is treated as independent evidence. (b) A distinguishing fusion strategy based on D-S evidence theory is proposed to combine the evidence from the multiple classifiers and to avoid the misclassification caused by conflicting evidence. The performance of the proposed algorithm is evaluated on eight public datasets, and the results show higher performance than other methods.
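At the heart of any D-S-based fusion is Dempster's rule of combination. As a rough illustration only (this is not the paper's distinguishing fusion strategy, which additionally handles strongly conflicting evidence), the sketch below combines the class-probability vectors of several classifiers by treating each vector as a Bayesian basic probability assignment whose mass sits only on the singleton classes; the classifiers, class counts, and probability values are hypothetical.

```python
import numpy as np

def dempster_combine(m1, m2):
    """Combine two Bayesian basic probability assignments (all mass on
    singleton classes) with Dempster's rule of combination."""
    agreement = m1 * m2                    # mass on classes where both pieces of evidence agree
    conflict = 1.0 - agreement.sum()       # total conflicting mass K
    if np.isclose(conflict, 1.0):
        raise ValueError("Totally conflicting evidence: Dempster's rule is undefined.")
    return agreement / (1.0 - conflict)    # renormalise the agreeing mass by 1 - K

def fuse_classifier_outputs(prob_outputs):
    """Fuse a list of per-classifier class-probability vectors pairwise."""
    fused = prob_outputs[0]
    for m in prob_outputs[1:]:
        fused = dempster_combine(fused, m)
    return fused

# Hypothetical outputs of three probabilistic classifiers for one 3-class sample.
outputs = [
    np.array([0.7, 0.2, 0.1]),
    np.array([0.6, 0.3, 0.1]),
    np.array([0.5, 0.4, 0.1]),
]
fused = fuse_classifier_outputs(outputs)
print(fused)                   # fused belief over the three classes
print(int(np.argmax(fused)))   # index of the predicted class
```

For Bayesian assignments the rule reduces to an element-wise product followed by renormalisation by 1 - K, where K is the conflicting mass; evidence with K close to 1 is exactly the situation the paper's distinguishing strategy is intended to handle.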




Acknowledgements

This research was funded by the project "Application of collaborative precision positioning service for mass users" (2016YFB0501805-1), the National Development and Reform Commission integrated data service system infrastructure platform construction project (JZNYYY001), and the Guangxi Key Lab of Multi-source Information Mining & Security (MIMS21-M-04).

Author information


Correspondence to Ruizhi Sun or Gang Yuan.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zhao, K., Li, L., Chen, Z. et al. A New Multi-classifier Ensemble Algorithm Based on D-S Evidence Theory. Neural Process Lett 54, 5005–5021 (2022). https://doi.org/10.1007/s11063-022-10845-2
