Sparse PCA: optimal rates and adaptive estimation
From MaRDI portal
Publication:2443213
DOI: 10.1214/13-AOS1178 · zbMath: 1288.62099 · arXiv: 1211.1309 · MaRDI QID: Q2443213
Zongming Ma, T. Tony Cai, Yihong Wu
Publication date: 4 April 2014
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1211.1309
aggregation; eigenvectors; covariance matrix; thresholding; optimal rate of convergence; group sparsity; principal components analysis; low-rank matrix; minimax lower bounds
Factor analysis and principal components; correspondence analysis (62H25) · Estimation in multivariate analysis (62H12) · Minimax procedures in statistical decision theory (62C20)
Related Items
- Testing equivalence of clustering
- Wald Statistics in high-dimensional PCA
- A simultaneous test of mean vector and covariance matrix in high-dimensional settings
- Optimal sparse eigenspace and low-rank density matrix estimation for quantum systems
- Robust \(\ell_1\) approaches to computing the geometric median and principal and independent components
- A literature review of (Sparse) exponential family PCA
- Estimation of conditional mean operator under the bandable covariance structure
- A high-dimensional test on linear hypothesis of means under a low-dimensional factor model
- Sparse PCA-based on high-dimensional Itô processes with measurement errors
- Lower bounds for invariant statistical models with applications to principal component analysis
- Bayesian inference for spectral projectors of the covariance matrix
- Testing the order of a population spectral distribution for high-dimensional data
- On two-sample mean tests under spiked covariances
- On the optimality of sliced inverse regression in high dimensions
- Minimax bounds for sparse PCA with noisy high-dimensional data
- Sparse SIR: optimal rates and adaptive estimation
- Efficient estimation of linear functionals of principal components
- Optimal detection of sparse principal components in high dimension
- Optimal estimation for lower bound of the packing number
- Large covariance estimation through elliptical factor models
- Factor models with local factors -- determining the number of relevant factors
- Minimax estimation in sparse canonical correlation analysis
- Solving sparse principal component analysis with global support
- Optimal Permutation Recovery in Permuted Monotone Matrix Model
- Estimation of functionals of sparse covariance matrices
- Free Energy Wells and Overlap Gap Property in Sparse PCA
- Consistent estimation of high-dimensional factor models when the factor number is over-estimated
- Eigen Selection in Spectral Clustering: A Theory-Guided Practice
- Large volatility matrix analysis using global and national factor models
- Distributed Estimation for Principal Component Analysis: An Enlarged Eigenspace Analysis
- Integrative Factor Regression and Its Inference for Multimodal Data Analysis
- Re-thinking high-dimensional mathematical statistics. Abstracts from the workshop held May 15--21, 2022
- Van Trees inequality, group equivariance, and estimation of principal subspaces
- A penalty-free infeasible approach for a class of nonsmooth optimization problems over the Stiefel manifold
- Bayesian sparse spiked covariance model with a continuous matrix shrinkage prior
- Sparse principal component analysis for high‐dimensional stationary time series
- Convergence of eigenvector empirical spectral distribution of sample covariance matrices
- Nonasymptotic upper bounds for the reconstruction error of PCA
- Limiting laws for divergent spiked eigenvalues and largest nonspiked eigenvalue of sample covariance matrices
- Statistical and computational limits for sparse matrix detection
- Euclidean Representation of Low-Rank Matrices and Its Geometric Properties
- Inference for low-rank models
- Long random matrices and tensor unfolding
- Envelopes and principal component regression
- Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements
- Convergence rates of eigenvector empirical spectral distribution of large dimensional sample covariance matrix
- An inexact Riemannian proximal gradient method
- Heterogeneity adjustment with applications to graphical model inference
- Sparse PCA: optimal rates and adaptive estimation
- Random matrix theory in statistics: a review
- The spectral norm of random inner-product kernel matrices
- Uniform Bounds for Invariant Subspace Perturbations
- Rate-optimal perturbation bounds for singular subspaces with applications to high-dimensional statistics
- Near-optimal stochastic approximation for online principal component estimation
- Detecting Markov random fields hidden in white noise
- Posterior contraction rates of the phylogenetic Indian buffet processes
- Projection tests for high-dimensional spiked covariance matrices
- From simple structure to sparse components: a review
- Convergence rate of Krasulina estimator
- Subspace perspective on canonical correlation analysis: dimension reduction and minimax rates
- Sparse Principal Component Analysis via Variable Projection
- Distributed estimation of principal eigenspaces
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- Projected principal component analysis in factor models
- Minimax rates in network analysis: graphon estimation, community detection and hypothesis testing
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- Subspace estimation from unbalanced and incomplete data matrices: \({\ell_{2,\infty}}\) statistical guarantees
- Principal component analysis in the local differential privacy model
- Optimal Sparse Singular Value Decomposition for High-Dimensional High-Order Data
- Robust covariance and scatter matrix estimation under Huber's contamination model
- Optimality and sub-optimality of PCA. I: Spiked random matrix models
- Perturbation bounds for eigenspaces under a relative gap condition
- Compressed covariance estimation with automated dimension learning
- Provable accelerated gradient method for nonconvex low rank optimization
- Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data
- Sparse principal component analysis with missing observations
- Recovering PCA from Hybrid-$(\ell_1,\ell_2)$ Sparse Sampling of Data Elements
- Spectral thresholding for the estimation of Markov chain transition operators
- The two-to-infinity norm and singular subspace geometry with applications to high-dimensional statistics
- ROP: matrix recovery via rank-one projections
- Sparsistency and agnostic inference in sparse PCA
- Optimal estimation and rank detection for sparse spiked covariance matrices
- Rate-optimal posterior contraction for sparse PCA
- Exact Penalty Function for $\ell_{2,1}$ Norm Minimization over the Stiefel Manifold
- Randomly initialized EM algorithm for two-component Gaussian mixture achieves near optimality in \(O(\sqrt{n})\) iterations
- Do semidefinite relaxations solve sparse PCA up to the information limit?
Uses Software
Cites Work
- Sparse principal component analysis and iterative thresholding
- Minimax bounds for sparse PCA with noisy high-dimensional data
- Optimal detection of sparse principal components in high dimension
- Exponential screening and optimal rates of sparse estimation
- Estimation of high-dimensional low-rank matrices
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Empirical processes with a bounded \(\psi_1\) diameter
- Oracle inequalities and optimal inference under group sparsity
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Optimal rates of convergence for sparse covariance matrix estimation
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- Sparse principal component analysis via regularized low rank matrix approximation
- Optimal rates of convergence for covariance matrix estimation
- Covariance regularization by thresholding
- Finite sample approximation results for principal component analysis: A matrix perturbation approach
- PCA consistency in high dimension, low sample size context
- Information-theoretic determination of minimax rates of convergence
- Combining different procedures for adaptive regression
- Functional aggregation for nonparametric regression
- Adaptive estimation of a quadratic functional by model selection
- On the distribution of the largest eigenvalue in principal components analysis
- Adaptive covariance matrix estimation through block thresholding
- Consistency of sparse PCA in high dimension, low sample size contexts
- Optimal estimation and rank detection for sparse spiked covariance matrices
- Sparse PCA: optimal rates and adaptive estimation
- Adapting to unknown sparsity by controlling the false discovery rate
- Eigenvalues of large sample covariance matrices of spiked population models
- Convergence of estimates under dimensionality restrictions
- Sparse Principal Component Analysis with Missing Observations
- Generalized power method for sparse principal component analysis
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Arbitrage, Factor Structure, and Mean-Variance Analysis on Large Asset Markets
- Sparse Variable PCA Using Geodesic Steepest Descent
- Learning Theory
- [https://portal.mardi4nfdi.de/wiki/Publication:4743580 Approximation dans les espaces métriques et théorie de l'estimation]
- On Consistency and Sparsity for Principal Components Analysis in High Dimensions
- Elements of Information Theory
- Gauss-Markov Estimation for Multivariate Linear Models: A Coordinate Free Approach
- The Rotation of Eigenvectors by a Perturbation. III
- Perturbation bounds in connection with singular value decomposition
- A Direct Formulation for Sparse PCA Using Semidefinite Programming
- Introduction to nonparametric estimation
- Gaussian model selection