

Boosted Self-evolving Neural Networks for Pattern Recognition

  • Conference paper
AI 2022: Advances in Artificial Intelligence (AI 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13728)

Abstract

It is well documented that both boosting and bagging algorithms improve ensemble performance. However, these algorithms have only rarely been applied to ensembles of constructivist learners based on neural networks. Although there have been previous attempts to develop similar ensemble learning algorithms for constructivist learners, our proposed approach also promotes greater diversity among the learners in the ensemble and offers a different way of handling imbalanced data sets. More specifically, this paper investigates how a modified version of the AdaBoost algorithm can generate an ensemble of simple incremental neural-network-based constructivist learners known as Self-Evolving Connectionist Systems (SECoS). We develop this boosting algorithm to leverage the accurate learning of the SECoS and to promote diversity among the SECoS learners, producing a strong model for classification tasks. Moreover, we adopt a minority-class sampling method inspired by RUSBoost to address the class imbalance problem when learning from data. Our proposed AdaBoostedSECoS (ABSECoS) learning framework is compared with other ensemble-based methods on four benchmark data sets, three of which exhibit class imbalance. The results of these experiments suggest that ABSECoS performs comparably to similar boosting-based ensemble methods.
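
To make the general idea concrete, the sketch below shows how an AdaBoost-style loop can be combined with RUSBoost-style random undersampling of the majority class. This is a minimal illustration, not the paper's implementation: the SECoS learner itself is not reproduced, `make_learner` stands in for any constructor returning a scikit-learn-style estimator with `fit`/`predict`, and the helper names `rus_sample`, `boost`, and `vote` are hypothetical.

```python
import numpy as np

def rus_sample(X, y, rng):
    """Randomly undersample every class down to the minority-class size."""
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    idx = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
        for c in classes
    ])
    return idx

def boost(X, y, make_learner, T=10, seed=0):
    """AdaBoost-style training loop for binary labels in {0, 1}."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)                    # boosting weights over samples
    learners, alphas = [], []
    for _ in range(T):
        idx = rus_sample(X, y, rng)            # rebalance classes each round
        h = make_learner()                     # hypothetical weak-learner factory
        h.fit(X[idx], y[idx], sample_weight=w[idx])
        pred = h.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        if err >= 0.5:                         # no better than chance: stop
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        w *= np.exp(alpha * np.where(pred != y, 1.0, -1.0))
        w /= w.sum()                           # renormalise to a distribution
        learners.append(h)
        alphas.append(alpha)
    return learners, alphas

def vote(X, learners, alphas):
    """Weighted majority vote of the ensemble (labels in {0, 1})."""
    score = sum(a * (2 * h.predict(X) - 1) for h, a in zip(learners, alphas))
    return (score > 0).astype(int)

# Example usage with any fit/predict weak learner, e.g. a decision stump:
#   from sklearn.tree import DecisionTreeClassifier
#   learners, alphas = boost(X, y, lambda: DecisionTreeClassifier(max_depth=1))
#   y_hat = vote(X, learners, alphas)
```

Two design points carry over from the abstract: undersampling each round keeps the weak learners diverse (they see different balanced subsets), and a learner whose weighted error reaches 0.5 contributes nothing, so the loop stops early rather than diluting the weighted vote.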

References

  1. Bergstra, J., Komer, B., Eliasmith, C., Yamins, D., Cox, D.D.: Hyperopt: a Python library for model selection and hyperparameter optimization. Comput. Sci. Discov. 8(1), 014008 (2015)

  2. Breiman, L.: Random Forests. Mach. Learn. 45(1), 5–32 (2001)

  3. Chawla, N.V., Bowyer, K.W., Hall, L.O., Kegelmeyer, W.P.: SMOTE: synthetic minority over-sampling technique. J. Artif. Intell. Res. 16, 321–357 (2002)

  4. Chen, T., Guestrin, C.: XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '16), pp. 785–794. ACM, New York (2016)

  5. Dhiman, B., Kumar, Y., Kumar, M.: Fruit quality evaluation using machine learning techniques: review, motivation and future perspectives. Multimed. Tools Appl. 81, 16255–16277 (2022)

  6. Dong, X., Yu, Z., Cao, W., Shi, Y., Ma, Q.: A survey on ensemble learning. Front. Comput. Sci. 14(2), 241–258 (2019). https://doi.org/10.1007/s11704-019-8208-z

  7. Fisher, R.A.: The use of multiple measurements in taxonomic problems. Ann. Eugenics 7(2), 179–188 (1936)

  8. Forina, M., Lanteri, S., Armanino, C., Casolino, C., Casale, M., Oliveri, P.: PARVUS - an extendible package for data exploration, classification and correlation. Tech. rep., Institute of Pharmaceutical and Food Analysis and Technologies, Dip. Chimica e Tecnologie Farmaceutiche ed Alimentari, Università di Genova (2008)

  9. Freund, Y., Schapire, R.E.: Experiments with a new boosting algorithm. In: Proceedings of the Thirteenth International Conference on Machine Learning (ICML 1996), pp. 148–156. Morgan Kaufmann (1996)

  10. Freund, Y., Schapire, R.E.: A decision-theoretic generalization of on-line learning and an application to boosting. J. Comput. Syst. Sci. 55(1), 119–139 (1997)

  11. Friedman, M.: The use of ranks to avoid the assumption of normality implicit in the analysis of variance. J. Am. Stat. Assoc. 32(200), 675–701 (1937)

  12. Horton, P., Nakai, K.: A probabilistic classification system for predicting the cellular localization sites of proteins. In: Proceedings of the Fourth International Conference on Intelligent Systems for Molecular Biology (ISMB 1996), vol. 4, pp. 109–115 (1996)

  13. Kasabov, N.: ECOS: a framework for evolving connectionist systems and the ECO learning paradigm. In: Proceedings of the 1998 Conference on Neural Information Processing and Intelligent Information Systems (ICONIP 1998), pp. 1232–1235. Ohmsha Ltd., Tokyo (1998)

  14. Kasabov, N.: Evolving Connectionist and Fuzzy-Connectionist Systems for On-line Adaptive Decision Making and Control. In: Roy, R., Furuhashi, T., Chawdhry, P.K. (eds) Advances in Soft Computing. Springer, London (1999). https://doi.org/10.1007/978-1-4471-0819-1_3

  15. Kasabov, N.: The ECOS framework and the ECO learning method for evolving connectionist systems. J. Adv. Comput. Intell. 2(6), 195–202 (1998)

  16. Kasabov, N.: Evolving fuzzy neural networks for supervised/unsupervised on-line, knowledge-based learning. IEEE Trans. Syst. Man Cybern. B Cybern. 31(6), 902–918 (2001)

  17. Kasabov, N., Woodford, B.: Rule insertion and rule extraction from evolving fuzzy neural networks: algorithms and applications for building adaptive, intelligent expert systems. In: Proceedings of the 1999 IEEE International Fuzzy Systems Conference (FUZZ-IEEE 1999), vol. 3, pp. 1406–1411. IEEE (1999)

  18. Minku, F.L., Ludermir, T.B.: EFuNN ensembles construction using CONE with multi-objective GA. In: 2006 Ninth Brazilian Symposium on Neural Networks (SBRN 2006), pp. 48–53 (2006)

  19. Minku, F.L., Ludermir, T.B.: EFuNNs ensembles construction using a clustering method and a coevolutionary genetic algorithm. In: 2006 IEEE International Conference on Evolutionary Computation, pp. 1399–1406 (2006)

  20. Nemenyi, P.: Distribution-free Multiple Comparisons. Ph.D. thesis, Princeton University (1963)

  21. Pedregosa, F., et al.: Scikit-learn: machine learning in Python. J. Mach. Learn. Res. 12, 2825–2830 (2011)

  22. Sagi, O., Rokach, L.: Ensemble learning: a survey. WIREs Data Min. Knowl. Discov. 8(4), e1249 (2018)

  23. Seiffert, C., Khoshgoftaar, T.M., Van Hulse, J., Napolitano, A.: RUSBoost: a hybrid approach to alleviating class imbalance. IEEE Trans. Syst. Man Cybern. A Syst. Humans 40(1), 185–197 (2010)

  24. Shi, H., Lv, X.: The naïve Bayesian classifier learning algorithm based on AdaBoost and parameter expectations. In: 2010 Third International Joint Conference on Computational Science and Optimization, vol. 2, pp. 377–381 (2010)

  25. Song, Q., Kasabov, N.: DENFIS: dynamic evolving neural-fuzzy inference system and its application for time-series prediction. IEEE Trans. Fuzzy Syst. 10(2), 144–154 (2002)

  26. Tharwat, A.: Classification assessment methods. Appl. Comput. Inf. 17(1), 168–192 (2021)

  27. Watts, M.: A decade of Kasabov's evolving connectionist systems: a review. IEEE Trans. Syst. Man Cybern. C Appl. Rev. 39(6), 684–693 (2009)

  28. Wolberg, W., Mangasarian, O.: Multisurface method of pattern separation for medical diagnosis applied to breast cytology. Proc. Nat. Acad. Sci. 87, 9193–9196 (1990)

  29. Woodford, B.J., Kasabov, N.K.: Ensembles of EFuNNs: an architecture for a multi-module classifier. In: Proceedings of the 10th IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2001), vol. III, pp. 1573–1576. IEEE (2001)

Author information

Correspondence to Brendon J. Woodford.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Woodford, B.J. (2022). Boosted Self-evolving Neural Networks for Pattern Recognition. In: Aziz, H., Corrêa, D., French, T. (eds.) AI 2022: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol. 13728. Springer, Cham. https://doi.org/10.1007/978-3-031-22695-3_32

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-22695-3_32

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-22694-6

  • Online ISBN: 978-3-031-22695-3

  • eBook Packages: Computer Science, Computer Science (R0)
