Convex and Nonconvex Optimization Are Both Minimax-Optimal for Noisy Blind Deconvolution Under Random Designs
Publication: 6109969
DOI: 10.1080/01621459.2021.1956501
arXiv: 2008.01724
OpenAlex: W3204623575
MaRDI QID: Q6109969
Yuxin Chen, Yuling Yan, Bingyan Wang, Jianqing Fan
Publication date: 4 July 2023
Published in: Journal of the American Statistical Association
Full work available at URL: https://arxiv.org/abs/2008.01724
Keywords: nonconvex optimization; convex relaxation; blind deconvolution; leave-one-out analysis; bilinear systems of equations
Cites Work
- On the impact of predictor geometry on the performance on high-dimensional ridge-regularized generalized robust regression estimators
- BranchHull: convex bilinear inversion from the entrywise product of signals with known signs
- Subspace estimation from unbalanced and incomplete data matrices: \({\ell_{2,\infty}}\) statistical guarantees
- Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
- Spectral method and regularized MLE are both optimal for top-\(K\) ranking
- Rapid, robust, and reliable blind deconvolution via nonconvex optimization
- ROP: matrix recovery via rank-one projections
- Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
- Phase recovery, MaxCut and complex semidefinite programming
- Exact matrix completion via convex optimization
- .879-approximation algorithms for MAX CUT and MAX 2SAT
- PhaseLift: Exact and Stable Signal Recovery from Magnitude Measurements via Convex Programming
- Guaranteed Matrix Completion via Non-Convex Factorization
- Identifiability in Blind Deconvolution With Subspace or Sparsity Constraints
- Simultaneously Structured Models With Application to Sparse and Low-Rank Matrices
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems
- Robust Spectral Compressed Sensing via Structured Matrix Completion
- Blind Deconvolution Using Convex Programming
- Blind Recovery of Sparse Signals From Subsampled Convolution
- Robust principal component analysis?
- Rank-Sparsity Incoherence for Matrix Decomposition
- Self-calibration and biconvex compressive sensing
- Blind Demixing and Deconvolution at Near-Optimal Rate
- Solving Systems of Random Quadratic Equations via Truncated Amplitude Flow
- GESPAR: Efficient Phase Retrieval of Sparse Signals
- Sparse Phase Retrieval via Truncated Amplitude Flow
- Nonconvex Demixing From Bilinear Measurements
- Near-Optimal Bounds for Phase Synchronization
- Fast and Guaranteed Blind Multichannel Deconvolution Under a Bilinear System Model
- Manifold Gradient Descent Solves Multi-Channel Sparse Blind Deconvolution Provably and Efficiently
- Solving (most) of a set of quadratic equalities: composite optimization for robust phase retrieval
- Nonconvex Low-Rank Tensor Completion from Noisy Data
- Convolutional Phase Retrieval via Gradient Descent
- Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization
- Leave-One-Out Approach for Matrix Completion: Primal and Dual Analysis
- Multichannel Sparse Blind Deconvolution on the Sphere
- Structured Local Optima in Sparse Blind Deconvolution
- Inference and uncertainty quantification for noisy matrix completion
- Optimization-Based AMP for Phase Retrieval: The Impact of Initialization and $\ell_{2}$ Regularization
- Blind Deconvolution by a Steepest Descent Algorithm on a Quotient Manifold
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- Regularized gradient descent: a non-convex recipe for fast joint blind deconvolution and demixing
- Compressed Sensing Off the Grid
- Optimal Injectivity Conditions for Bilinear Inverse Problems with Applications to Identifiability of Deconvolution Problems
- Blind Deconvolution Meets Blind Demixing: Algorithms and Performance Bounds
- Low-rank matrix completion using alternating minimization