Abstract
Multi-label feature selection is crucial for managing feature redundancy and irrelevance in high-dimensional datasets. Existing methods reduce information redundancy through subspace dimensionality reduction, but they often suffer from instability caused by high degrees of freedom and lack flexibility because they assume a single subspace shared by features and labels, which degrades performance. To address these problems, we introduce a novel multi-label feature selection approach. Specifically, we propose a dual subspace learning method that captures both label correlations and feature correlations for feature selection. Our method thereby mitigates the adverse effects of noisy, redundant, and imperfect features in the search for discriminative features, and it reduces the sensitivity of the learned model to noise and outliers in the data. Empirical experiments on real-world datasets demonstrate the efficiency and superiority of the proposed approach.
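As context for the formulation summarized above, the minimal sketch below illustrates only the ℓ2,1-sparse regression backbone on which subspace-sparsity feature selectors of this kind are typically built: a sparse projection W is learned from min_W ||XW − Y||_F² + λ||W||_{2,1} via the standard iteratively reweighted least-squares scheme, and features are ranked by the row norms of W. This is an illustrative assumption rather than the paper's dual subspace objective; the function name, the regularization weight lam, and the iteration count are placeholders.

```python
import numpy as np

def l21_feature_selection(X, Y, lam=1.0, n_iter=50, eps=1e-8):
    """Rank features by solving  min_W ||XW - Y||_F^2 + lam * ||W||_{2,1}
    with an iteratively reweighted least-squares scheme.
    Returns feature indices sorted by decreasing importance and the scores.
    Note: a generic sketch, not the dual subspace model proposed in the paper."""
    n, d = X.shape
    D = np.eye(d)  # reweighting matrix, starts as the identity
    for _ in range(n_iter):
        # Closed-form update for fixed D: W = (X^T X + lam * D)^{-1} X^T Y
        W = np.linalg.solve(X.T @ X + lam * D, X.T @ Y)
        # Refresh D from the current row norms of W (eps avoids division by zero)
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))
    scores = np.sqrt((W ** 2).sum(axis=1))  # row-wise l2 norms act as feature scores
    return np.argsort(scores)[::-1], scores

# Toy usage: 100 samples, 20 features, 5 labels driven by the first 5 features
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
Y = (X[:, :5] @ rng.standard_normal((5, 5)) > 0).astype(float)
ranking, scores = l21_feature_selection(X, Y, lam=0.5)
print("Top-5 features:", ranking[:5])
```

The row-sparsity induced by the ℓ2,1 penalty drives whole rows of W toward zero, so low-scoring features can be discarded jointly across all labels; the dual subspace model in the paper additionally learns separate feature-side and label-side subspaces on top of such a sparse selection term.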
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Zhou, Y., Yuan, B., Zhong, Y., Li, Y. (2024). Multi-label Robust Feature Selection via Subspace-Sparsity Learning. In: Wand, M., Malinovská, K., Schmidhuber, J., Tetko, I.V. (eds.) Artificial Neural Networks and Machine Learning – ICANN 2024. Lecture Notes in Computer Science, vol. 15016. Springer, Cham. https://doi.org/10.1007/978-3-031-72332-2_1