Abstract
An SVM-like framework provides a novel way to learn linear principal component analysis (PCA); in fact, it yields a weighted PCA and leads to a semi-definite programming (SDP) problem. In this paper, we learn linear and nonlinear PCA via linear programming problems, which are easy to solve and admit a unique global solution. Moreover, two algorithms for learning linear and nonlinear PCA are constructed, and all principal components can be obtained. To verify the performance of the proposed method, a series of experiments on artificial datasets and UCI benchmark datasets is carried out. Simulation results demonstrate that the proposed method can match or outperform standard PCA and kernel PCA (KPCA) in generalization ability, while consuming far less memory and time.
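For context, the two baselines the abstract compares against can be sketched as follows. This is a minimal illustration of standard eigendecomposition-based PCA and RBF-kernel KPCA (the Schölkopf-style formulation), not the linear-programming method proposed in the paper; the function names and the `gamma` parameter are illustrative choices.

```python
import numpy as np

def pca(X, k):
    """Standard linear PCA: project onto the top-k eigenvectors
    of the sample covariance matrix."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)
    vals, vecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]         # pick the k largest
    return Xc @ vecs[:, order]                 # principal component scores

def kpca(X, k, gamma=1.0):
    """Kernel PCA with an RBF kernel: eigendecomposition of the
    centered Gram matrix in feature space."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)
    order = np.argsort(vals)[::-1][:k]
    # Normalize expansion coefficients so feature-space PCs have unit norm
    alphas = vecs[:, order] / np.sqrt(np.maximum(vals[order], 1e-12))
    return Kc @ alphas                         # nonlinear component scores
```

Note the cost asymmetry the abstract alludes to: KPCA stores and decomposes an n-by-n Gram matrix, which dominates memory and time for large sample sizes, whereas linear PCA works with the d-by-d covariance.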
Zhang, R., Wang, W. Learning Linear and Nonlinear PCA with Linear Programming. Neural Process Lett 33, 151–170 (2011). https://doi.org/10.1007/s11063-011-9170-4