Fundamental limits of low-rank matrix estimation with diverging aspect ratios
Publication: 6621532
DOI: 10.1214/24-aos2400
MaRDI QID: Q6621532
Publication date: 18 October 2024
Published in: The Annals of Statistics
Mathematics Subject Classification:
- Random matrices (probabilistic aspects) (60B20)
- Measures of information, entropy (94A17)
- Statistical aspects of information-theoretic topics (62B10)
Cites Work
- Sparse principal component analysis and iterative thresholding
- Optimal detection of sparse principal components in high dimension
- The singular values and vectors of low rank perturbations of large rectangular random matrices
- Statistical guarantees for the EM algorithm: from population to sample-based analysis
- The eigenvalues and eigenvectors of finite, low rank perturbations of large random matrices
- A spectral algorithm for learning mixture models
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- Fundamental limits of symmetric low-rank matrix estimation
- Optimality and sub-optimality of PCA. I: Spiked random matrix models
- On the distribution of the largest eigenvalue in principal components analysis
- Sharp optimal recovery in the two component Gaussian mixture model
- Statistical limits of spiked tensor models
- Testing in high-dimensional spiked models
- When do birds of a feather flock together? \(k\)-means, proximity, and conic programming
- The adaptive interpolation method: a simple scheme to prove replica formulas in Bayesian inference
- Partial recovery bounds for clustering with the relaxed \(K\)-means
- Computational and statistical boundaries for submatrix localization in a large noisy matrix
- CHIME: clustering of high-dimensional Gaussian mixtures with EM algorithm and its optimality
- Eigenvalues of large sample covariance matrices of spiked population models
- Phase transition of the largest eigenvalue for nonnull complex sample covariance matrices
- Estimation of low-rank matrices via approximate message passing
- Testing for high-dimensional geometry in random graphs
- Sparse PCA via Covariance Thresholding
- Efficiently learning mixtures of two Gaussians
- Generalized power method for sparse principal component analysis
- Relax, No Need to Round
- Improved Spectral-Norm Bounds for Clustering
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- Mutual Information and Minimum Mean-Square Error in Gaussian Channels
- Isotropic PCA and Affine-Invariant Clustering
- Community Detection and Stochastic Block Models
- Submatrix localization via message passing
- Asymptotic mutual information for the balanced binary stochastic block model
- Clustering subgaussian mixtures by semidefinite programming
- Information-Theoretic Bounds and Phase Transitions in Clustering, Sparse PCA, and Submatrix Localization
- On Consistency and Sparsity for Principal Components Analysis in High Dimensions
- The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
- PAC Learning Axis-Aligned Mixtures of Gaussians with No Separation Assumption
- Approximating K‐means‐type Clustering via Semidefinite Programming
- Learning Theory
- Polynomial Learning of Distribution Families
- Contributions to the mathematical theory of evolution.
- Spiked singular values and vectors under extreme aspect ratios
- Mutual information for the sparse stochastic block model
- Empirical Bayes PCA in high dimensions