Nonasymptotic upper bounds for the reconstruction error of PCA
From MaRDI portal
Publication: 2196210
DOI: 10.1214/19-AOS1839
zbMath: 1450.62070
arXiv: 1609.03779
OpenAlex: W3030181961
MaRDI QID: Q2196210
Publication date: 28 August 2020
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1609.03779
Keywords: concentration inequalities; reconstruction error; principal component analysis (PCA); excess risk; spectral projectors
Mathematics Subject Classification:
- Factor analysis and principal components; correspondence analysis (62H25)
- Inequalities involving eigenvalues and eigenvectors (15A42)
- Large deviations (60F10)
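The paper's central quantity, the reconstruction error of PCA, can be illustrated with a short numpy sketch. The data, dimensions, and noise level below are made up purely for illustration; the sketch only shows the standard identity that the empirical reconstruction error of the top-k principal subspace equals the sum of the discarded sample eigenvalues, not the paper's nonasymptotic bounds themselves.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 500, 20, 3  # illustrative sample size, dimension, and rank

# Synthetic centered data with a dominant k-dimensional signal subspace.
X = rng.standard_normal((n, k)) @ rng.standard_normal((k, d)) \
    + 0.1 * rng.standard_normal((n, d))
X -= X.mean(axis=0)

# Sample covariance and the spectral projector onto its top-k eigenspace.
S = X.T @ X / n
eigvals, eigvecs = np.linalg.eigh(S)  # eigenvalues in ascending order
U = eigvecs[:, -k:]                   # top-k eigenvectors
P = U @ U.T                           # rank-k spectral projector

# Empirical reconstruction error: average squared distance of the
# observations to the fitted principal subspace.
recon_err = np.mean(np.linalg.norm(X - X @ P, axis=1) ** 2)

# This equals the sum of the d-k smallest sample eigenvalues.
assert np.isclose(recon_err, eigvals[:-k].sum())
print(recon_err)
```

The identity holds because the rowwise error is (1/n)·||X(I−P)||_F² = tr(S(I−P)), which is exactly the sum of the eigenvalues of S outside the top-k eigenspace.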
Related Items
- Non-asymptotic error bound for optimal prediction of function-on-function regression by RKHS approach
- Inference in latent factor regression with clusterable features
- Lower bounds for invariant statistical models with applications to principal component analysis
- Certified dimension reduction in nonlinear Bayesian inverse problems
- Asymptotically efficient estimation of smooth functionals of covariance operators
- Compressive statistical learning with random feature moments
- Efficient estimation of linear functionals of principal components
- A note on the prediction error of principal component regression in high dimensions
- Inference on the maximal rank of time-varying covariance matrices using high-frequency data
- Van Trees inequality, group equivariance, and estimation of principal subspaces
- Bootstrapping max statistics in high dimensions: near-parametric rates under weak variance decay and application to functional and multinomial data
- Estimating covariance and precision matrices along subspaces
- Higher-order principal component analysis for the approximation of tensors in tree-based low-rank formats
- Distributed estimation of principal eigenspaces
- Perturbation bounds for eigenspaces under a relative gap condition
- Estimating multi-index models with response-conditional least squares
- Principal component analysis for multivariate extremes
- High-probability bounds for the reconstruction error of PCA
- Statistical analysis of Mapper for stochastic and multivariate filters
- Bootstrapping the operator norm in high dimensions: error estimation for covariance matrices and sketching
- Relative perturbation bounds with applications to empirical covariance operators
Cites Work
- Unnamed Item
- Optimal eigen expansions and uniform bounds
- Inference for functional data with applications
- Concentration inequalities and moment bounds for sample covariance operators
- Asymptotics and concentration bounds for bilinear forms of spectral projectors of sample covariance
- Normal approximation and concentration of spectral projectors of sample covariance
- Asymptotic theory for the principal component analysis of a vector random function: Some applications to statistical inference
- Finite sample approximation results for principal component analysis: A matrix perturbation approach
- New asymptotic results in principal component analysis
- Perturbation theory for linear operators
- Principal component analysis
- Concentration of norms and eigenvalues of random matrices
- Minimax sparse principal subspace estimation in high dimensions
- Sparse PCA: optimal rates and adaptive estimation
- High-dimensional principal projections
- Local Rademacher complexities
- On the Eigenspectrum of the Gram Matrix and the Generalization Error of Kernel-PCA
- Theory for high-order bounds in functional principal components analysis
- An Introduction to Random Matrices
- Learning Theory
- Theoretical Foundations of Functional Data Analysis, with an Introduction to Linear Operators
- On Consistency and Sparsity for Principal Components Analysis in High Dimensions
- A useful variant of the Davis–Kahan theorem for statisticians
- Concentration Inequalities for Sums and Martingales
- Deviation Inequalities on Largest Eigenvalues
- Some new bounds on perturbation of subspaces