Abstract
In this paper, we address the instability of sparse coding using low-rank representation (LRR), a promising technique for discovering the subspace structures of data. Graph regularized sparse coding has been extensively studied as a way to preserve the locality of high-dimensional observations. In practice, however, data are often corrupted by noise, so samples from the same class may not lie in each other's neighborhoods. To this end, we present a novel method for robust sparse representation, dubbed low-rank graph regularized sparse coding (LogSC). LogSC uses LRR to capture the multiple subspace structures of the data and preserves this structure in the resulting sparse codes. Unlike traditional methods, ours learns the sparse codes and the LRR jointly rather than separately, and it preserves the global structure of the data rather than merely the local structure. The resulting sparse codes are therefore not only robust to corrupted samples, thanks to the LRR, but also discriminative, owing to the preservation of the multiple subspaces. The optimization problem of LogSC can be effectively tackled by the linearized alternating direction method with adaptive penalty. To evaluate our approach, we apply LogSC to image clustering and classification, and we additionally test it under noisy conditions. Encouraging experimental results on public image data sets demonstrate the discriminability, robustness, and practicality of the proposed LogSC.
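Since the full model is not reproduced on this page, the following display is only a plausible sketch of the kind of joint objective the abstract describes, not the paper's exact formulation; the dictionary $D$, sparse codes $S$, LRR coefficients $Z$, error term $E$, and trade-off weights $\lambda, \beta, \gamma, \eta$ are assumed notation:

$$\min_{D,S,Z,E}\; \|X - DS\|_F^2 + \lambda\|S\|_1 + \beta\|Z\|_* + \gamma\,\mathrm{tr}\!\left(S L_Z S^{\top}\right) + \eta\|E\|_{2,1} \quad \text{s.t.}\quad X = XZ + E,$$

where $L_Z$ is the graph Laplacian built from the affinity $(|Z| + |Z^{\top}|)/2$, so that sparse codes of samples lying in the same subspace (as revealed by the low-rank $Z$) are pulled together. Coupling the two terms in one problem is what "jointly rather than separately" refers to.

The abstract states that the problem is solved with the linearized alternating direction method with adaptive penalty (LADMAP). Below is a minimal sketch of the three proximal steps any such solver for this type of model would need, assuming NumPy; the function names are illustrative, not taken from the paper:

```python
import numpy as np

def soft_threshold(A, tau):
    """Proximal operator of tau * ||.||_1: elementwise soft-thresholding,
    used for the l1-regularized sparse-code update."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def svt(A, tau):
    """Singular value thresholding: proximal operator of tau * ||.||_*,
    used for the nuclear-norm (low-rank) update of Z."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def prox_l21(A, tau):
    """Proximal operator of tau * ||.||_{2,1}: shrinks each column of the
    error matrix E toward zero, yielding column-sparse (sample-specific) noise."""
    norms = np.linalg.norm(A, axis=0, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return A * scale
```

Each LADMAP iteration linearizes the smooth coupling terms around the current iterate, applies one of these proximal maps per variable, and increases the penalty parameter adaptively until the constraint residual is sufficiently small.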
Acknowledgments
This research was funded by the Fundamental Research Funds for the Central Universities (Grant No. G2018KY0301) and the National Natural Science Foundation of China (Grants No. 61332014 and 61772426).
Copyright information
© 2018 Springer Nature Switzerland AG
About this paper
Cite this paper
Zhang, Y., Liu, S., Shang, X., Xiang, M. (2018). Low-Rank Graph Regularized Sparse Coding. In: Geng, X., Kang, BH. (eds) PRICAI 2018: Trends in Artificial Intelligence. PRICAI 2018. Lecture Notes in Computer Science, vol 11012. Springer, Cham. https://doi.org/10.1007/978-3-319-97304-3_14
DOI: https://doi.org/10.1007/978-3-319-97304-3_14
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-97303-6
Online ISBN: 978-3-319-97304-3
eBook Packages: Computer Science (R0)