Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
From MaRDI portal
Publication: 5240484
DOI: 10.1109/TSP.2019.2937282
OpenAlex: W2969215180
MaRDI QID: Q5240484
Yuejie Chi, Yuxin Chen, Yue M. Lu
Publication date: 28 October 2019
Published in: IEEE Transactions on Signal Processing
Full work available at URL: https://arxiv.org/abs/1809.09573
Related Items
An optimal statistical and computational framework for generalized tensor estimation
Analysis of Asymptotic Escape of Strict Saddle Sets in Manifold Optimization
Sketched learning for image denoising
Compressive Learning for Patch-Based Image Denoising
Role of sparsity and structure in the optimization landscape of non-convex matrix sensing
Optimization landscape of Tucker decomposition
A Nonlinear Matrix Decomposition for Mining the Zeros of Sparse Data
Asymmetry helps: eigenvalue and eigenvector analyses of asymmetrically perturbed low-rank matrices
Nonconvex Low-Rank Tensor Completion from Noisy Data
Analytical convergence regions of accelerated gradient descent in nonconvex optimization under regularity condition
GNMR: A Provable One-Line Algorithm for Low Rank Matrix Recovery
Nonsmooth rank-one matrix factorization landscape
Geometry of Linear Convolutional Networks
Sharp global convergence guarantees for iterative nonconvex optimization with random data
Analysis of the optimization landscape of Linear Quadratic Gaussian (LQG) control
Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
Structured Gradient Descent for Fast Robust Low-Rank Hankel Matrix Completion
Spurious Valleys, NP-Hardness, and Tractability of Sparse Matrix Factorization with Fixed Support
Convex and Nonconvex Optimization Are Both Minimax-Optimal for Noisy Blind Deconvolution Under Random Designs
Fast gradient method for low-rank matrix estimation
Certifying the Absence of Spurious Local Minima at Infinity
Bayesian uncertainty quantification for low-rank matrix completion
A Generalization of Wirtinger Flow for Exact Interferometric Inversion
Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
Algorithmic Regularization in Model-Free Overparametrized Asymmetric Matrix Factorization
Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
Convergence of Random Reshuffling under the Kurdyka–Łojasiewicz Inequality
Nearly optimal bounds for the global geometric landscape of phase retrieval
Low-Rank Univariate Sum of Squares Has No Spurious Local Minima
Adversarial classification via distributional robustness with Wasserstein ambiguity
Second-Order Guarantees of Distributed Gradient Algorithms
Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization
Optimization for deep learning: an overview
Recent Theoretical Advances in Non-Convex Optimization
Median-Truncated Gradient Descent: A Robust and Scalable Nonconvex Approach for Signal Estimation
Exact Recovery of Multichannel Sparse Blind Deconvolution via Gradient Descent
Smoothed amplitude flow-based phase retrieval algorithm
Exact matrix completion based on low rank Hankel structure in the Fourier domain
Communication-Efficient Distributed Eigenspace Estimation
Subspace estimation from unbalanced and incomplete data matrices: \({\ell_{2,\infty}}\) statistical guarantees
Reconstruction of low-rank aggregation kernels in univariate population balance equations
Nonconvex Robust Low-Rank Matrix Recovery
Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data
Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
A Riemannian rank-adaptive method for low-rank matrix completion
Exponential-Family Embedding With Application to Cell Developmental Trajectories for Single-Cell RNA-Seq Data
Rank $2r$ Iterative Least Squares: Efficient Recovery of Ill-Conditioned Low Rank Matrices from Few Entries
Non-convex exact community recovery in stochastic block model
Low-Rank Matrix Estimation from Rank-One Projections by Unlifted Convex Optimization
Low rank matrix recovery with adversarial sparse noise*