Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization
From MaRDI portal
Publication: 5131966
DOI: 10.1137/19M1290000
zbMATH: 1477.90060
arXiv: 1902.07698
OpenAlex: W3094857757
MaRDI QID: Q5131966
Cong Ma, Yuejie Chi, Yuxin Chen, Yuling Yan, Jianqing Fan
Publication date: 9 November 2020
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1902.07698
Related Items
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- Asymmetry helps: eigenvalue and eigenvector analyses of asymmetrically perturbed low-rank matrices
- Nonconvex Low-Rank Tensor Completion from Noisy Data
- Improved Performance Guarantees for Orthogonal Group Synchronization via Generalized Power Method
- GNMR: A Provable One-Line Algorithm for Low Rank Matrix Recovery
- A universal rank approximation method for matrix completion
- Convex and Nonconvex Optimization Are Both Minimax-Optimal for Noisy Blind Deconvolution Under Random Designs
- Compressed spectral screening for large-scale differential correlation analysis with application in selecting glioblastoma gene modules
- Inference for low-rank models
- Inference for low-rank completion without sample splitting with application to treatment effect estimation
- A selective overview of deep learning
- Subspace estimation from unbalanced and incomplete data matrices: \({\ell_{2,\infty}}\) statistical guarantees
- Error bound of critical points and KL property of exponent 1/2 for squared F-norm regularized factorization
- Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data
- Unnamed Item
- Proof methods for robust low-rank matrix recovery
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Sufficient forecasting using factor models
- Factor-Adjusted Regularized Model Selection
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Matrix completion via max-norm constrained optimization
- Fixed point and Bregman iterative methods for matrix rank minimization
- Estimation of high-dimensional low-rank matrices
- Angular synchronization by eigenvectors and semidefinite programming
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Factor models and variable selection in high-dimensional regression analysis
- On the impact of predictor geometry on the performance of high-dimensional ridge-regularized generalized robust regression estimators
- Theory of semidefinite programming for sensor network localization
- “Preconditioning” for feature selection and regression in high-dimensional problems
- Problems of distance geometry and convex properties of quadratic maps
- A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization
- Robust covariance estimation for approximate factor models
- Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm
- Fast and provable algorithms for spectrally sparse signal reconstruction via low-rank Hankel matrix completion
- Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
- Entrywise eigenvector analysis of random matrices with low expected rank
- The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square
- Approximate support recovery of atomic line spectral estimation: a tale of resolution and precision
- Spectral method and regularized MLE are both optimal for top-\(K\) ranking
- Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
- Noisy low-rank matrix completion with general sampling distribution
- Exact matrix completion via convex optimization
- Asymmetry helps: eigenvalue and eigenvector analyses of asymmetrically perturbed low-rank matrices
- Guarantees of Riemannian Optimization for Low Rank Matrix Recovery
- Improved Iteratively Reweighted Least Squares for Unconstrained Smoothed $\ell_q$ Minimization
- Low-Rank Matrix Completion by Riemannian Optimization
- Guaranteed Matrix Completion via Non-Convex Factorization
- Incoherence-Optimal Matrix Completion
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems
- Robust Spectral Compressed Sensing via Structured Matrix Completion
- Blind Deconvolution Using Convex Programming
- Robust principal component analysis?
- A Singular Value Thresholding Algorithm for Matrix Completion
- Rank-Sparsity Incoherence for Matrix Decomposition
- Low-rank Matrix Recovery via Iteratively Reweighted Least Squares Minimization
- Confidence Intervals for Diffusion Index Forecasts and Inference for Factor-Augmented Regressions
- Self-calibration and biconvex compressive sensing
- Interior-Point Method for Nuclear Norm Approximation with Application to System Identification
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Blind Demixing and Deconvolution at Near-Optimal Rate
- The Projected Power Method: An Efficient Algorithm for Joint Alignment from Pairwise Differences
- Poisson Matrix Recovery and Completion
- Matrix Completion With Deterministic Pattern: A Geometric Perspective
- Near-Optimal Bounds for Phase Synchronization
- Estimating False Discovery Proportion Under Arbitrary Covariance Dependence
- Nonconvex Rectangular Matrix Completion via Gradient Descent Without ℓ₂,∞ Regularization
- FarmTest: Factor-Adjusted Robust Multiple Testing With Approximate False Discovery Control
- Model-free Nonconvex Matrix Completion: Local Minima Analysis and Applications in Memory-efficient Kernel PCA
- Inference and uncertainty quantification for noisy matrix completion
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- Correlated z-Values and the Accuracy of Large-Scale Statistical Estimates
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- Matrix Completion From a Few Entries
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Correlation and Large-Scale Simultaneous Significance Testing
- Blind Deconvolution Meets Blind Demixing: Algorithms and Performance Bounds
- A Simpler Approach to Matrix Completion
- Restricted strong convexity and weighted matrix completion: Optimal bounds with noise
- Learning Theory
- Low-rank matrix completion using alternating minimization
- Large Covariance Estimation by Thresholding Principal Orthogonal Complements