Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
Publication: 2195855
DOI: 10.1007/s11222-020-09939-5
zbMath: 1448.62135
arXiv: 1801.08227
OpenAlex: W3011407644
MaRDI QID: Q2195855
Rahul Mazumder, Diego Saldana, Haolei Weng
Publication date: 27 August 2020
Published in: Statistics and Computing
Full work available at URL: https://arxiv.org/abs/1801.08227
Mathematics Subject Classification:
- Estimation in multivariate analysis (62H12)
- Inference from stochastic processes and spectral analysis (62M15)
- Convex programming (90C25)
- Matrix completion problems (15A83)
- Missing data (62D10)
Related Items
Uses Software
Cites Work
- Matrix Completion and Low-Rank SVD via Fast Alternating Least Squares
- Nearly unbiased variable selection under minimax concave penalty
- A unified approach to model selection and sparse recovery using regularized least squares
- The Adaptive Lasso and Its Oracle Properties
- Best subset selection via a modern optimization lens
- Estimation of high-dimensional low-rank matrices
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Enhancing sparsity by reweighted \(\ell_1\) minimization
- Convex analysis and nonlinear optimization. Theory and examples.
- One-step sparse estimates in nonconcave penalized likelihood models
- Spectral analysis of large dimensional random matrices
- Estimation of the mean of a multivariate normal distribution
- Regularization and the small-ball method. I: Sparse recovery
- Least angle regression. (With discussion)
- Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
- Weighted nuclear norm minimization and its applications to low level vision
- Sorted concave penalized regression
- A Bayesian approach for noisy matrix completion: optimal rate under general sampling distribution
- Noisy low-rank matrix completion with general sampling distribution
- Exact matrix completion via convex optimization
- An Extended Frank--Wolfe Method with “In-Face” Directions, and Its Application to Low-Rank Matrix Completion
- Guaranteed Matrix Completion via Non-Convex Factorization
- Incoherence-Optimal Matrix Completion
- A Singular Value Thresholding Algorithm for Matrix Completion
- SparseNet: Coordinate Descent With Nonconvex Penalties
- Low-rank Matrix Recovery via Iteratively Reweighted Least Squares Minimization
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Iteratively reweighted least squares minimization for sparse recovery
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Local Strong Homogeneity of a Regularized Estimator
- Does \(\ell_p\)-Minimization Outperform \(\ell_1\)-Minimization?
- Unbiased Risk Estimates for Singular Value Thresholding and Spectral Estimators
- Matrix Completion With Deterministic Pattern: A Geometric Perspective
- A Statistical View of Some Chemometrics Regression Tools
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- The Discrete Dantzig Selector: Estimating Sparse Linear Models via Mixed Integer Linear Optimization
- A Simpler Approach to Matrix Completion
- Restricted strong convexity and weighted matrix completion: Optimal bounds with noise
- Low-rank matrix completion using alternating minimization
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Convex Analysis
- A general theory of concave regularization for high-dimensional sparse estimation problems