Partial Multi-label Learning with Label and Feature Collaboration

  • Conference paper
  • In: Database Systems for Advanced Applications (DASFAA 2020)
  • Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12112)

Abstract

Partial multi-label learning (PML) models the scenario in which each training instance is annotated with a set of candidate labels, of which only some are relevant. The PML problem is practical in real-world settings, since precisely labeled samples are difficult, and sometimes impossible, to obtain. Several PML solutions have been proposed to avoid being misled by the irrelevant labels concealed in the candidate label sets, but they generally rely on the smoothness assumption in the feature space or the low-rank assumption in the label space, while ignoring the negative information between features and labels. Specifically, if two instances have largely overlapping candidate labels, their ground-truth labels should be similar irrespective of their feature similarity; conversely, if they are dissimilar in both the feature and candidate label spaces, their ground-truth labels should also be dissimilar. To obtain a credible predictor on PML data, we propose a novel approach called PML-LFC (Partial Multi-label Learning with Label and Feature Collaboration). PML-LFC estimates the confidence value of each candidate label for every instance using similarity from both the label and feature spaces, and trains the desired predictor with the estimated confidence values. PML-LFC obtains the predictor and the latent label matrix in a mutually reinforcing manner within a unified model, and develops an alternating optimization procedure to optimize them. An extensive empirical study on both synthetic and real-world datasets demonstrates the superiority of PML-LFC.

This work is supported by NSFC (61872300 and 61871010).
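The abstract describes PML-LFC only at a high level. As a rough illustration of the general idea, and not the paper's actual formulation, the sketch below estimates per-instance label confidences by blending feature similarity with candidate-label similarity, restricts the confidence mass to each instance's candidate labels, and alternates that estimate with refitting a simple linear predictor. The cosine and Jaccard similarities, the ridge-regression predictor, the blending weight alpha, and the 0.5/0.5 mixing step are all illustrative assumptions introduced here.

    import numpy as np

    def estimate_confidences(X, Y, alpha=0.5, n_iters=10, ridge=1.0):
        """Toy sketch of label/feature collaboration for PML.

        X : (n, d) feature matrix.
        Y : (n, q) binary candidate-label matrix (1 = candidate label).
        Returns (P, W): an estimated confidence matrix and a linear predictor.
        The similarity choices and the ridge predictor are illustrative
        assumptions, not the formulation used in the paper.
        """
        n, d = X.shape

        # Feature similarity: cosine similarity, clipped to [0, 1].
        Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
        S_feat = np.clip(Xn @ Xn.T, 0.0, 1.0)

        # Candidate-label similarity: Jaccard overlap of candidate label sets.
        inter = Y @ Y.T
        union = Y.sum(1, keepdims=True) + Y.sum(1) - inter
        S_lab = inter / np.maximum(union, 1e-12)

        # Combined similarity: pairs that agree in both spaces get large
        # weight; pairs dissimilar in both spaces contribute almost nothing.
        S = alpha * S_feat + (1 - alpha) * S_lab
        np.fill_diagonal(S, 0.0)
        S = S / np.maximum(S.sum(1, keepdims=True), 1e-12)  # row-normalize

        P = Y.astype(float)   # start from the candidate labels
        W = np.zeros((d, Y.shape[1]))
        for _ in range(n_iters):
            # (1) Propagate confidences from similar instances, keeping mass
            #     only on each instance's own candidate labels.
            P = (S @ P) * Y
            P = P / np.maximum(P.sum(1, keepdims=True), 1e-12)
            # (2) Refit a ridge-regression predictor on the current confidences.
            W = np.linalg.solve(X.T @ X + ridge * np.eye(d), X.T @ P)
            # (3) Blend the predictor's output back into the confidences,
            #     mimicking the mutual reinforcement between the predictor
            #     and the latent label matrix described in the abstract.
            P = 0.5 * P + 0.5 * np.clip(X @ W, 0.0, 1.0) * Y
        return P, W

Calling estimate_confidences(X, Y) on a feature matrix X and a binary candidate-label matrix Y returns a confidence matrix supported only on the candidate labels together with a linear predictor; a faithful implementation would instead follow the unified objective and alternating optimization derived in the paper itself.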

Author information

Corresponding author: Guoxian Yu.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Yu, T., Yu, G., Wang, J., Guo, M. (2020). Partial Multi-label Learning with Label and Feature Collaboration. In: Nah, Y., Cui, B., Lee, S.W., Yu, J.X., Moon, Y.S., Whang, S.E. (eds) Database Systems for Advanced Applications. DASFAA 2020. Lecture Notes in Computer Science, vol 12112. Springer, Cham. https://doi.org/10.1007/978-3-030-59410-7_41
