
Statistics Papers
Document Type
Journal Article
Date of this Version
2013
Publication Source
The Annals of Statistics
Volume
41
Issue
6
Start Page
3074
Last Page
3110
DOI
10.1214/13-AOS1178
Abstract
Principal component analysis (PCA) is one of the most commonly used statistical procedures, with a wide range of applications. This paper considers both minimax and adaptive estimation of the principal subspace in the high-dimensional setting. Under mild technical conditions, we first establish the optimal rates of convergence for estimating the principal subspace, which are sharp with respect to all the parameters and thus provide a complete characterization of the difficulty of the estimation problem in terms of the convergence rate. The lower bound is obtained by calculating the local metric entropy and applying Fano's lemma. The rate-optimal estimator is constructed using aggregation, which, however, might not be computationally feasible.
We then introduce an adaptive procedure for estimating the principal subspace which is fully data-driven and can be computed efficiently. It is shown that the estimator attains the optimal rates of convergence simultaneously over a large collection of parameter spaces. A key idea in our construction is a reduction scheme which reduces the sparse PCA problem to a high-dimensional multivariate regression problem. This method is potentially useful for other related problems as well.
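To give a concrete, if simplified, sense of the "select sparse coordinates, then estimate the subspace" idea described above, the following is a minimal Python sketch. It uses a diagonal-thresholding selection step (in the spirit of Johnstone and Lu) followed by PCA on the selected block; it is not the aggregation or regression-based procedure developed in the paper, and the threshold constant alpha and the noise-variance proxy are illustrative assumptions only.

import numpy as np

def sparse_pca_subspace(X, r, alpha=2.0):
    """Sketch of a sparse principal-subspace estimator via coordinate selection.

    Not the Cai-Ma-Wu procedure: this is a generic diagonal-thresholding
    illustration of data-driven coordinate selection followed by PCA.
    X is assumed to be an n-by-p matrix of centered observations.
    """
    n, p = X.shape
    S = X.T @ X / n                        # sample covariance matrix
    sigma2 = np.median(np.diag(S))         # crude proxy for the noise variance (assumption)
    thresh = sigma2 * (1 + alpha * np.sqrt(np.log(p) / n))
    keep = np.where(np.diag(S) > thresh)[0]
    if keep.size < r:                      # fall back to the r largest sample variances
        keep = np.argsort(np.diag(S))[::-1][:r]
    # PCA restricted to the selected coordinates; eigh returns ascending eigenvalues.
    eigvals, eigvecs = np.linalg.eigh(S[np.ix_(keep, keep)])
    V = np.zeros((p, r))
    V[keep, :] = eigvecs[:, ::-1][:, :r]   # top-r eigenvectors, zero-padded to dimension p
    return V                               # orthonormal basis of the estimated subspace

# Example on synthetic data with a sparse leading eigenvector.
rng = np.random.default_rng(0)
n, p = 200, 500
v = np.zeros(p); v[:10] = 1 / np.sqrt(10)
X = rng.standard_normal((n, p)) + 3.0 * rng.standard_normal((n, 1)) * v
V_hat = sparse_pca_subspace(X, r=1)
print("inner product with truth:", abs(V_hat[:, 0] @ v))

The adaptive estimator in the paper replaces the simple variance threshold with a reduction to a group-sparse multivariate regression problem, which is what yields simultaneous rate optimality over a large collection of parameter spaces.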
Keywords
adaptive estimation, aggregation, covariance matrix, eigenvector, group sparsity, low-rank matrix, minimax lower bound, optimal rate of convergence, principal component analysis, thresholding
Recommended Citation
Cai, T., Ma, Z., & Wu, Y. (2013). Sparse PCA: Optimal Rates and Adaptive Estimation. The Annals of Statistics, 41 (6), 3074-3110. http://dx.doi.org/10.1214/13-AOS1178
Date Posted: 27 November 2017
This document has been peer reviewed.