Abstract
It is well documented that both boosting and bagging improve ensemble performance. However, such algorithms have only infrequently been applied to ensembles of constructivist learners based on neural networks. Although there have been previous attempts at developing ensemble learning algorithms for constructivist learners, our approach additionally addresses the need for diversity among the learners in the ensemble and offers a different way of handling imbalanced data sets. More specifically, this paper investigates how a modified version of the AdaBoost algorithm can generate an ensemble of simple incremental neural network-based constructivist learners known as Self-Evolving Connectionist Systems (SECoS). The boosting algorithm is designed to leverage the accurate learning of the SECoS while promoting diversity among the SECoS learners, yielding an effective model for classification tasks. Moreover, we adopt a minority-class sampling method inspired by RUSBoost to address the class imbalance problem when learning from data. The proposed AdaBoostedSECoS (ABSECoS) learning framework is compared with other ensemble-based methods on four benchmark data sets, three of which exhibit class imbalance. The results of these experiments suggest that ABSECoS performs comparably to similar boosting-based ensemble methods.
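As a rough illustration of the boosting-with-undersampling idea the abstract describes — not the paper's actual ABSECoS implementation, since SECoS itself is not reproduced here — the sketch below combines an AdaBoost-style reweighting loop with RUSBoost-style random undersampling. A toy nearest-centroid classifier is used as a hypothetical stand-in for the SECoS weak learner; the function and class names are illustrative only.

```python
import numpy as np

class NearestCentroidStub:
    """Toy weak learner used as a hypothetical stand-in for SECoS."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        # Squared distance to each class centroid; predict the nearest class.
        d = ((X[:, None, :] - self.centroids_[None, :, :]) ** 2).sum(axis=-1)
        return self.classes_[d.argmin(axis=1)]

def rus_boost(X, y, n_rounds=10, seed=0):
    """AdaBoost-style loop with per-round random undersampling (RUS)."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)                 # example weights
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()                    # minority-class size
    learners, alphas = [], []
    for _ in range(n_rounds):
        # Undersample every class down to the minority-class size.
        idx = np.concatenate([
            rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
            for c in classes])
        h = NearestCentroidStub().fit(X[idx], y[idx])
        pred = h.predict(X)
        err = w[pred != y].sum()            # weighted error on the full set
        if err == 0:                        # perfect learner: keep it and stop
            learners.append(h)
            alphas.append(1.0)
            break
        if err >= 0.5:                      # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / err)
        learners.append(h)
        alphas.append(alpha)
        # Upweight misclassified examples, downweight correct ones.
        w *= np.where(pred != y, np.exp(alpha), np.exp(-alpha))
        w /= w.sum()
    return learners, alphas, classes

def ensemble_predict(learners, alphas, classes, X):
    """Weighted majority vote over the boosted learners."""
    votes = np.zeros((len(X), len(classes)))
    for h, a in zip(learners, alphas):
        pred = h.predict(X)
        for j, c in enumerate(classes):
            votes[:, j] += a * (pred == c)
    return classes[votes.argmax(axis=1)]
```

The undersampling step is what distinguishes this from plain AdaBoost: each weak learner sees a class-balanced sample, while the weight update and final vote still follow the usual boosting scheme.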
References
Bergstra, J., Komer, B., Eliasmith, C., Yamins, D., Cox, D.D.: Hyperopt: a Python library for model selection and hyperparameter optimization. Comput. Sci. Discov. 8(1), 014008 (2015)
Breiman, L.: Random Forests. Mach. Learn. 45(1), 5–32 (2001)
Chawla, N.V., Bowyer, K.W., Hall, L.O., Kegelmeyer, W.P.: SMOTE: synthetic minority over-sampling technique. J. Artif. Intell. Res. 16(1), 321–357 (2002)
Chen, T., Guestrin, C.: XGBoost: A scalable tree boosting system. In: Proc. 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. pp. 785–794. KDD’16, ACM, New York, NY, USA (2016)
Dhiman, B., Kumar, Y., Kumar, M.: Fruit quality evaluation using machine learning techniques: review, motivation and future perspectives. Multimedia Tools Appl. 81, 16255–16277 (2022)
Dong, X., Yu, Z., Cao, W., Shi, Y., Ma, Q.: A survey on ensemble learning. Front. Comput. Sci. 14(2), 241–258 (2019). https://doi.org/10.1007/s11704-019-8208-z
Fisher, R.A.: The use of multiple measurements in taxonomic problems. Ann. Eugen. 7(2), 179–188 (1936)
Forina, M., Lanteri, S., Armanino, C., Casolino, C., Casale, M., Oliveri, P.: PARVUS - An Extendible Package for Data Exploration, Classification and Correlation. Tech. rep., Institute of Pharmaceutical and Food Analysis and Technologies, Dip. Chimica e Tecnologie Farmaceutiche ed Alimentari, Università di Genova (2008)
Freund, Y., Schapire, R.E.: Experiments with a new boosting algorithm. In: Proceedings of the Thirteenth International Conference on Machine Learning, pp. 148–156. IEEE Press (1996)
Freund, Y., Schapire, R.E.: A decision-theoretic generalization of on-line learning and an application to boosting. J. Comput. Syst. Sci. 55(1), 119–139 (1997)
Friedman, M.: The use of ranks to avoid the assumption of normality implicit in the analysis of variance. J. Am. Stat. Assoc. 32(200), 675–701 (1937)
Horton, P., Nakai, K.: A probabilistic classification system for predicting the cellular localization sites of proteins. In: Proceedings of the Fourth International Conference on Intelligent Systems for Molecular Biology, vol. 4, pp. 109–115 (1996)
Kasabov, N.: ECOS: A Framework For Evolving Connectionist Systems and the ECO Learning Paradigm. In: Proceedings of the 1998 Conference on Neural Information Processing and Intelligent Information Systems, (ICONIP’1998), pp. 1232–1235. Ohmsha Ltd: Tokyo, Japan (1998)
Kasabov, N.: Evolving Connectionist and Fuzzy-Connectionist Systems for On-line Adaptive Decision Making and Control. In: Roy, R., Furuhashi, T., Chawdhry, P.K. (eds) Advances in Soft Computing. Springer, London (1999). https://doi.org/10.1007/978-1-4471-0819-1_3
Kasabov, N.: The ECOS framework and the ECO learning method for evolving connectionist systems. J. Adv. Comput. Intell. 2(6), 195–202 (1998)
Kasabov, N.: Evolving Fuzzy Neural Networks for Supervised/Unsupervised On-Line, Knowledge-Based Learning. In: IEEE Transactions on Systems, Man and Cybernetics, Part B: Cybernetics, vol. 31, no. 6, pp. 902–918 (2001)
Kasabov, N., Woodford, B.: Rule insertion and rule extraction from evolving fuzzy neural networks: algorithms and applications for building adaptive, intelligent expert systems. In: Proceedings of the 1999 IEEE Fuzzy Systems Conference. vol. 3, pp. 1406–1411. The IEEE, Kyunghee Printing Co (1999)
Minku, F.L., Ludermir, T.B.: EFuNN Ensembles Construction Using CONE with Multi-objective GA. In: 2006 Ninth Brazilian Symposium on Neural Networks (SBRN’06), pp. 48–53 (2006)
Minku, F.L., Ludermir, T.B.: EFuNNs Ensembles Construction Using a Clustering Method and a Coevolutionary Genetic Algorithm. In: 2006 IEEE International Conference on Evolutionary Computation, pp. 1399–1406 (2006)
Nemenyi, P.: Distribution-free Multiple Comparisons. Ph.D. thesis, Princeton University (1963)
Pedregosa, F., et al.: Scikit-learn: machine learning in Python. J. Mach. Learn. Res. 12, 2825–2830 (2011)
Sagi, O., Rokach, L.: Ensemble learning: a survey. WIREs Data Mining Knowl. Discov. 8(4), e1249 (2018)
Seiffert, C., Khoshgoftaar, T.M., Van Hulse, J., Napolitano, A.: RUSBoost: A Hybrid Approach to Alleviating Class Imbalance. In: IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, vol. 40, no. 1, pp. 185–197 (2010)
Shi, H., Lv, X.: The naïve Bayesian classifier learning algorithm based on AdaBoost and parameter expectations. In: 2010 Third International Joint Conference on Computational Science and Optimization, vol. 2, pp. 377–381 (2010)
Song, Q., Kasabov, N.: DENFIS: dynamic evolving neural-fuzzy inference system and its application for time-series prediction. IEEE Trans. Fuzzy Syst. 10(2), 144–154 (2002)
Tharwat, A.: Classification assessment methods. Appl. Comput. Inf. 17(1), 168–192 (2021)
Watts, M.: A Decade of Kasabov’s Evolving Connectionist Systems: A Review. IEEE Trans. Syst. Man Cybern - Part C: Appl. Rev. 39(6), 684–693 (2009)
Wolberg, W., Mangasarian, O.: Multisurface method of pattern separation for medical diagnosis applied to breast cytology. Proc. Nat. Acad. Sci. 87, 9193–9196 (1990)
Woodford, B.J., Kasabov, N.K.: Ensembles of EFuNNs: an architecture for a multi module classifier. In: Proceedings of FUZZ-IEEE 2001, The 10th IEEE International Conference on Fuzzy Systems, vol. III, pp. 1573–1576. IEEE (2001)
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Woodford, B.J. (2022). Boosted Self-evolving Neural Networks for Pattern Recognition. In: Aziz, H., Corrêa, D., French, T. (eds) AI 2022: Advances in Artificial Intelligence. AI 2022. Lecture Notes in Computer Science, vol. 13728. Springer, Cham. https://doi.org/10.1007/978-3-031-22695-3_32
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-22694-6
Online ISBN: 978-3-031-22695-3