Fantope Projection and Selection: A near-optimal convex relaxation of sparse PCA
Part of Advances in Neural Information Processing Systems 26 (NIPS 2013)
Vincent Q. Vu, Juhee Cho, Jing Lei, Karl Rohe
We propose a novel convex relaxation of sparse principal subspace estimation based on the convex hull of rank-$d$ projection matrices (the Fantope). The convex problem can be solved efficiently using the alternating direction method of multipliers (ADMM). We establish a near-optimal convergence rate, in terms of the sparsity, ambient dimension, and sample size, for estimation of the principal subspace of a general covariance matrix, without assuming the spiked covariance model. In the special case of $d=1$, our result implies the near-optimality of DSPCA even when the solution is not rank 1. We also provide a general theoretical framework for analyzing the statistical properties of the method with arbitrary input matrices, extending its applicability and provable guarantees to a wide array of settings. We demonstrate this with an application to Kendall's tau correlation matrices and transelliptical component analysis.
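To make the relaxation concrete, below is a minimal Python sketch of an ADMM solver for the objective $\max_X \langle S, X\rangle - \lambda \|X\|_1$ over the Fantope $\mathcal{F}^d = \{X : 0 \preceq X \preceq I,\ \mathrm{tr}(X) = d\}$. The function names (`fantope_projection`, `fps_admm`), the bisection tolerance, the penalty parameter `rho`, and the fixed iteration count are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def fantope_projection(W, d):
    """Project a symmetric matrix W onto the Fantope
    F^d = {X : 0 <= X <= I, trace(X) = d}.
    Done by water-filling on the eigenvalues of W."""
    w, V = np.linalg.eigh(W)
    # Bisect for theta such that sum(clip(w - theta, 0, 1)) = d.
    lo, hi = w.min() - 1.0, w.max()
    for _ in range(60):
        theta = 0.5 * (lo + hi)
        if np.clip(w - theta, 0.0, 1.0).sum() > d:
            lo = theta
        else:
            hi = theta
    gamma = np.clip(w - 0.5 * (lo + hi), 0.0, 1.0)
    return (V * gamma) @ V.T

def fps_admm(S, d, lam, rho=1.0, n_iter=200):
    """ADMM sketch for max_X <S, X> - lam * ||X||_1 over F^d,
    splitting the Fantope constraint (X) from the l1 penalty (Y)."""
    p = S.shape[0]
    Y = np.zeros((p, p))
    U = np.zeros((p, p))  # scaled dual variable
    for _ in range(n_iter):
        # X-step: Fantope projection of Y - U + S / rho.
        X = fantope_projection(Y - U + S / rho, d)
        # Y-step: entrywise soft-thresholding at lam / rho.
        Y = np.sign(X + U) * np.maximum(np.abs(X + U) - lam / rho, 0.0)
        # Dual update.
        U = U + X - Y
    return Y
```

Each iteration costs one eigendecomposition (for the Fantope projection) plus elementwise operations, which is what makes the convex problem tractable at moderate dimensions.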
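For the rank-based application mentioned above, a hedged sketch of the input construction follows: build the sin-transformed pairwise Kendall's tau matrix (using the elliptical-copula identity $\rho = \sin(\pi\tau/2)$) and pass it to the solver in place of a sample covariance. The helper name `transformed_kendall`, the toy data, and the tuning values `d=2`, `lam=0.1` are assumptions for illustration; it reuses `fps_admm` from the sketch above.

```python
import numpy as np
from scipy.stats import kendalltau

def transformed_kendall(X):
    """Entrywise sin(pi/2 * tau) transform of the pairwise Kendall's
    tau matrix of the columns of an n-by-p data matrix X."""
    p = X.shape[1]
    R = np.eye(p)
    for j in range(p):
        for k in range(j + 1, p):
            tau, _ = kendalltau(X[:, j], X[:, k])
            R[j, k] = R[k, j] = np.sin(np.pi * tau / 2.0)
    return R

# Usage: estimate a sparse 2-dimensional principal subspace from the
# rank-based matrix rather than the sample covariance.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
S_tau = transformed_kendall(X)
X_hat = fps_admm(S_tau, d=2, lam=0.1)
```

This substitution is the point of the general framework: the analysis does not require the input matrix to be a sample covariance, so rank-based surrogates such as Kendall's tau plug in directly.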