
Low-Rank Graph Regularized Sparse Coding

  • Conference paper
PRICAI 2018: Trends in Artificial Intelligence (PRICAI 2018)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11012)

Included in the following conference series: PRICAI (Pacific Rim International Conference on Artificial Intelligence)

Abstract

In this paper, we propose a solution to the instability problem of sparse coding based on low-rank representation (LRR), a promising technique for discovering the subspace structures of data. Graph regularized sparse coding has been studied extensively as a means of preserving the locality of high-dimensional observations. In practice, however, data are often corrupted by noise, so samples from the same class may not lie in neighboring regions. To this end, we present a novel method for robust sparse representation, dubbed low-rank graph regularized sparse coding (LogSC). LogSC uses LRR to capture the multiple-subspace structure of the data and preserves this structure in the resulting sparse codes. Unlike traditional methods, LogSC learns the sparse codes and the low-rank representation jointly rather than separately, and it maintains the global structure of the data rather than merely the local structure. The resulting sparse codes are thus not only robust to corrupted samples, thanks to the LRR, but also discriminative, owing to the preservation of the multiple subspaces. The optimization problem of LogSC can be tackled effectively by the linearized alternating direction method with adaptive penalty (LADMAP). To evaluate our approach, we apply LogSC to image clustering and classification, and we further examine it in noisy scenarios. Encouraging experimental results on public image data sets demonstrate the discriminative power, robustness, and practicality of the proposed LogSC.
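The abstract couples three ingredients: a sparse-coding term, a low-rank representation of the data, and a graph regularizer that transfers the subspace structure captured by the LRR into the codes. For orientation, one plausible form of such a joint objective, reconstructed here from the abstract alone (the exact formulation, weights, and constraints are those given in the paper itself), is:

```latex
\min_{D,\,S,\,Z,\,E}\;
  \|X - DS\|_F^2
  + \lambda \|S\|_1                                 % sparsity of the codes S
  + \beta  \|Z\|_*                                  % low-rank representation Z
  + \gamma \|E\|_{2,1}                              % column-sparse noise term E
  + \eta\,\mathrm{tr}\!\bigl(S L_Z S^{\top}\bigr)   % graph regularizer from Z
\quad \text{s.t.} \quad X = XZ + E,
```

where $X$ is the data matrix, $D$ the dictionary, $S$ the sparse codes, $E$ the noise term, and $L_Z$ a graph Laplacian built from the learned affinity $(|Z| + |Z^{\top}|)/2$.

A LADMAP-style solver for such an objective alternates closed-form proximal updates. Below is a minimal sketch, assuming NumPy, of the two proximal operators such a solver would rely on: soft thresholding for the $\ell_1$ term and singular value thresholding for the nuclear norm. The function names are illustrative, not taken from the paper.

```python
import numpy as np

def soft_threshold(A: np.ndarray, tau: float) -> np.ndarray:
    """Proximal operator of tau * ||.||_1: elementwise shrinkage.
    Building block for the sparse-code (S) update."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def svt(A: np.ndarray, tau: float) -> np.ndarray:
    """Proximal operator of tau * ||.||_* via singular value thresholding.
    Building block for the low-rank (Z) update."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Shrink the singular values, then recompose; columns of U are scaled
    # by the thresholded singular values via broadcasting.
    return (U * np.maximum(s - tau, 0.0)) @ Vt
```

LADMAP linearizes the smooth and constraint-coupled parts of the objective around the current iterate so that each variable update reduces to one of these operators, and it adapts the penalty parameter across iterations to speed convergence.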



Acknowledgments

This research was funded by the Fundamental Research Funds for the Central Universities (Grant No. G2018KY0301) and the National Natural Science Foundation of China (Grants No. 61332014 and 61772426).

Author information


Correspondence to Yupei Zhang or Xuequn Shang.



Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Zhang, Y., Liu, S., Shang, X., Xiang, M. (2018). Low-Rank Graph Regularized Sparse Coding. In: Geng, X., Kang, B.H. (eds.) PRICAI 2018: Trends in Artificial Intelligence. PRICAI 2018. Lecture Notes in Computer Science (LNAI), vol. 11012. Springer, Cham. https://doi.org/10.1007/978-3-319-97304-3_14


  • DOI: https://doi.org/10.1007/978-3-319-97304-3_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-97303-6

  • Online ISBN: 978-3-319-97304-3

  • eBook Packages: Computer Science, Computer Science (R0)
