Abstract
Compared to conventional machine learning techniques, the extreme learning machine (ELM), which trains single-hidden-layer feedforward neural networks (SLFNs), offers faster learning and better generalization performance. However, like most representative supervised learning algorithms, ELM tends to produce biased decision models when datasets are imbalanced. In this paper, a two-stage weighted regularized ELM is proposed to address this issue. The original regularized ELM (RELM) was designed to handle the adverse effects of outliers rather than the imbalanced learning problem, so in the first stage we propose a new weighted regularized ELM (WRELM) for class imbalance learning (CIL). Unlike the existing weighted ELM, which considers only the class distribution of the dataset, the proposed algorithm also focuses on hard, misclassified samples in the second stage: the focal loss function is adopted to update the sample weights, decreasing the weight of well-classified samples so that more attention is paid to the errors of difficult samples. The final decision is made by the winner-take-all rule. We assess the proposed method on 25 binary datasets and 10 multiclass datasets using 5-fold cross-validation. The results indicate that the proposed algorithm is an efficient method for CIL and outperforms other ELM-based CIL algorithms.
M. Xu—This work is supported by National Natural Science Foundation of China (NSFC) under grant 61473089.
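To make the two-stage idea concrete, below is a minimal NumPy sketch of one plausible reading of the abstract: a sigmoid random hidden layer, first-stage class-balance weights w_i = 1/n_class(i), and a second-stage focal-style factor (1 - p_i)^gamma computed from the first-stage outputs, followed by a winner-take-all decision. The function and parameter names (two_stage_wrelm, gamma, the sigmoid mapping of scores to confidences p_i) are illustrative assumptions, not the authors' implementation.

import numpy as np

def elm_hidden(X, W_in, b):
    # Sigmoid activation of the random hidden layer: H = g(X W_in + b)
    return 1.0 / (1.0 + np.exp(-(X @ W_in + b)))

def weighted_relm(H, T, sample_w, C=1.0):
    # Weighted regularized ELM output weights:
    # beta = (I/C + H^T W H)^{-1} H^T W T, with W = diag(sample_w)
    HtW = H.T * sample_w                      # same as H^T @ diag(sample_w)
    A = np.eye(H.shape[1]) / C + HtW @ H
    return np.linalg.solve(A, HtW @ T)

def two_stage_wrelm(X, y, n_hidden=100, C=1.0, gamma=2.0, seed=0):
    rng = np.random.default_rng(seed)
    classes, y_idx = np.unique(y, return_inverse=True)
    n, k = len(y), len(classes)

    # One-vs-all targets in {-1, +1}
    T = -np.ones((n, k))
    T[np.arange(n), y_idx] = 1.0

    # Random input weights and biases, never trained (standard ELM)
    W_in = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = elm_hidden(X, W_in, b)

    # Stage 1: class-balance weights w_i = 1 / (#samples in class of x_i)
    w1 = 1.0 / np.bincount(y_idx)[y_idx]
    beta1 = weighted_relm(H, T, w1, C)

    # Stage 2: focal-style reweighting from stage-1 outputs; samples that are
    # already classified confidently (large p) are down-weighted by (1 - p)^gamma
    p = 1.0 / (1.0 + np.exp(-(H @ beta1)[np.arange(n), y_idx]))
    beta2 = weighted_relm(H, T, w1 * (1.0 - p) ** gamma, C)

    # Winner-take-all decision on the stage-2 outputs
    def predict(X_new):
        return classes[np.argmax(elm_hidden(X_new, W_in, b) @ beta2, axis=1)]
    return predict

Usage would follow the usual fit/predict pattern, e.g. predict = two_stage_wrelm(X_train, y_train) followed by y_hat = predict(X_test); the regularization constant C and the hypothetical focusing parameter gamma would be tuned by cross-validation as in the paper's experiments.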
Copyright information
© 2019 Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Xu, M., Yu, Y. (2019). Two-Stage Weighted Regularized Extreme Learning Machine for Class Imbalanced Learning. In: Sun, F., Liu, H., Hu, D. (eds) Cognitive Systems and Signal Processing. ICCSIP 2018. Communications in Computer and Information Science, vol 1005. Springer, Singapore. https://doi.org/10.1007/978-981-13-7983-3_32
DOI: https://doi.org/10.1007/978-981-13-7983-3_32
Publisher Name: Springer, Singapore
Print ISBN: 978-981-13-7982-6
Online ISBN: 978-981-13-7983-3
eBook Packages: Computer Science (R0)