Minimax bounds for sparse PCA with noisy high-dimensional data
Publication: 366956
DOI: 10.1214/12-AOS1014
zbMATH: 1292.62071
arXiv: 1203.0967
Wikidata: Q30862008 (Scholia: Q30862008)
MaRDI QID: Q366956
Aharon Birnbaum, Iain M. Johnstone, Boaz Nadler, Debashis Paul
Publication date: 25 September 2013
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1203.0967
MSC classification: Factor analysis and principal components; correspondence analysis (62H25); Asymptotic properties of nonparametric inference (62G20)
Related Items
- Testing equivalence of clustering
- A simultaneous test of mean vector and covariance matrix in high-dimensional settings
- Optimal sparse eigenspace and low-rank density matrix estimation for quantum systems
- Statistical inference for principal components of spiked covariance matrices
- Influence diagnostics in support vector machines
- Detection of hubs in complex networks by the Laplacian matrix
- A high-dimensional test on linear hypothesis of means under a low-dimensional factor model
- Sparse PCA-based on high-dimensional Itô processes with measurement errors
- Recent developments in high dimensional covariance estimation and its related issues, a review
- Bayesian inference for spectral projectors of the covariance matrix
- Testing the order of a population spectral distribution for high-dimensional data
- On two-sample mean tests under spiked covariances
- On the optimality of sliced inverse regression in high dimensions
- Asymptotically efficient estimation of smooth functionals of covariance operators
- Optimal detection of sparse principal components in high dimension
- Large covariance estimation through elliptical factor models
- Minimax estimation in sparse canonical correlation analysis
- Estimation of functionals of sparse covariance matrices
- New asymptotic results in principal component analysis
- Principal component analysis: a review and recent developments
- Sparse constrained projection approximation subspace tracking
- Sparse principal component analysis for high-dimensional stationary time series
- Limiting laws for divergent spiked eigenvalues and largest nonspiked eigenvalue of sample covariance matrices
- Inference for low-rank models
- Long random matrices and tensor unfolding
- Minimax sparse principal subspace estimation in high dimensions
- Sparse PCA: optimal rates and adaptive estimation
- An asymptotically minimax kernel machine
- Random matrix theory in statistics: a review
- The spectral norm of random inner-product kernel matrices
- Robust covariance estimation for approximate factor models
- The limits of the sample spiked eigenvalues for a high-dimensional generalized Fisher matrix and its applications
- Posterior contraction rates of the phylogenetic Indian buffet processes
- Large Covariance Estimation by Thresholding Principal Orthogonal Complements
- Sparse equisigned PCA: algorithms and performance bounds in the noisy rank-1 setting
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- Rejoinder of "Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation"
- Robust covariance and scatter matrix estimation under Huber's contamination model
- Optimality and sub-optimality of PCA. I: Spiked random matrix models
- Detecting changes in the second moment structure of high-dimensional sensor-type data in a K-sample setting
- Testing and estimating change-points in the covariance matrix of a high-dimensional time series
- Random matrix theory and its applications
- ROP: matrix recovery via rank-one projections
- Sparsistency and agnostic inference in sparse PCA
- Optimal estimation and rank detection for sparse spiked covariance matrices
- Rate-optimal posterior contraction for sparse PCA
- Do semidefinite relaxations solve sparse PCA up to the information limit?
Cites Work
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- Sparse principal component analysis via regularized low rank matrix approximation
- Winding number criterion for existence and uniqueness of equilibrium in linear rational expectations models
- Optimal rates of convergence for covariance matrix estimation
- Covariance regularization by thresholding
- Operator norm consistent estimation of large-dimensional sparse covariance matrices
- Finite sample approximation results for principal component analysis: A matrix perturbation approach
- Information-theoretic determination of minimax rates of convergence
- Sphere packings. (Edited by John Talbot)
- Principal component analysis.
- Consistency of sparse PCA in high dimension, low sample size contexts
- Sparse PCA: optimal rates and adaptive estimation
- Regularized estimation of large covariance matrices
- Eigenvalues of large sample covariance matrices of spiked population models
- Adaptive Thresholding for Sparse Covariance Matrix Estimation
- Probabilistic Principal Component Analysis
- On Consistency and Sparsity for Principal Components Analysis in High Dimensions
- Discussion
- Generalized Thresholding of Large Covariance Matrices
- Chi-square oracle inequalities
- Asymptotic Theory for Principal Component Analysis
- A Direct Formulation for Sparse PCA Using Semidefinite Programming