Abstract
Principal component analysis (PCA) is the most widely used method for linear dimensionality reduction, owing to its effectiveness in uncovering the low-dimensional global geometric structure embedded in data. To preserve the intrinsic local geometric structure of the data, graph-Laplacian PCA (gLPCA) incorporates Laplacian embedding into the PCA framework to learn local similarities between data points, which leads to significant performance improvements in clustering and classification. Recent work has shown that not only do high-dimensional data reside on a low-dimensional manifold in the data space, but the features also lie on a manifold in the feature space. However, both PCA and gLPCA overlook the local geometric information contained in the feature space. By exploiting the duality between the data manifold and the feature manifold, graph-dual Laplacian PCA (gDLPCA) is proposed, which incorporates both data-graph and feature-graph regularization into the PCA framework so as to exploit the local geometric structures of the data manifold and the feature manifold simultaneously. Experimental results on four benchmark data sets confirm its effectiveness and show that gDLPCA outperforms gLPCA on classification and clustering tasks.
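As context for the graph-regularized objective the abstract describes, the following is a minimal numerical sketch of the gLPCA building block, using the closed-form eigendecomposition reported by Jiang et al. (2013): the embedding Q is taken from the smallest eigenvectors of G = −XᵀX + αL, where L is a data-graph Laplacian. The function names (`knn_laplacian`, `glpca`), the k-NN graph construction with binary weights, and the parameter choices are illustrative assumptions, not the authors' implementation; the dual method gDLPCA additionally penalizes the loadings U with a feature-graph Laplacian, which this sketch only notes in a comment.

```python
import numpy as np

def knn_laplacian(Z, k=3):
    # Build a symmetric binary k-NN adjacency over the rows of Z,
    # then return the unnormalized graph Laplacian L = D - W.
    n = Z.shape[0]
    dist = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=2)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(dist[i])[1:k + 1]  # skip self (distance 0)
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)                   # symmetrize
    return np.diag(W.sum(axis=1)) - W

def glpca(X, d=2, alpha=1.0, k=3):
    # X: features x samples.  Closed-form gLPCA-style solution:
    # Q = the d smallest eigenvectors of G = -X^T X + alpha * L_data,
    # U = X Q (least-squares loadings, since Q^T Q = I).
    # gDLPCA would add a feature-graph term, e.g. beta * tr(U^T L_feat U),
    # coupling the solution to the feature manifold as well.
    L = knn_laplacian(X.T, k=k)              # graph over samples
    G = -X.T @ X + alpha * L
    _, vecs = np.linalg.eigh(G)              # ascending eigenvalues
    Q = vecs[:, :d]                          # samples x d embedding
    U = X @ Q                                # features x d loadings
    return U, Q
```

With α = 0 this reduces to ordinary PCA on the centered data; increasing α trades reconstruction fidelity for smoothness of the embedding over the data graph.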
Acknowledgements
The authors would like to thank the anonymous reviewers and the editor for their helpful comments and suggestions to improve the quality of this paper. We also thank Zhaolu Guo of Jiangxi University of Science and Technology for helpful discussions and Jie Su of the College of Information Engineering, Northwest A&F University, for his extensive experimental work. This work was supported in part by the National Natural Science Foundation of China under Grant 61602388, the China Postdoctoral Science Foundation under Grant 2018M633585, the Natural Science Basic Research Plan in Shaanxi Province of China under Grant 2018JQ6060, the Doctoral Starting-up Foundation of Northwest A&F University under Grant 2452015302, and the Open Project Program of the National Laboratory of Pattern Recognition (NLPR) under Grant 201700009.
Cite this article
He, J., Bi, Y., Liu, B. et al. Graph-dual Laplacian principal component analysis. J Ambient Intell Human Comput 10, 3249–3262 (2019). https://doi.org/10.1007/s12652-018-1096-5