Fast gradient method for low-rank matrix estimation
From MaRDI portal
Publication: 6111368
DOI: 10.1007/s10915-023-02266-7 · zbMath: 1527.65026 · arXiv: 2211.16236 · OpenAlex: W4381054020 · MaRDI QID: Q6111368
Chengwei Pan, Hongyi Li, Zhen Peng, Di Zhao
Publication date: 6 July 2023
Published in: Journal of Scientific Computing
Full work available at URL: https://arxiv.org/abs/2211.16236
Keywords: Riemannian optimization; low-rank matrix estimation; local convergence analysis; adaptive restart scheme; Nesterov's accelerated Riemannian gradient
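The keywords above combine Nesterov's accelerated gradient with an adaptive restart scheme. As background (not the paper's own algorithm, which works on a Riemannian low-rank manifold), a minimal Euclidean sketch of accelerated gradient descent with a function-value restart, in the spirit of the "Adaptive restart for accelerated gradient schemes" reference cited below, looks like this; the function names and the test problem are illustrative assumptions:

```python
import numpy as np

def nesterov_with_restart(f, grad, x0, step, iters=500):
    """Nesterov's accelerated gradient with function-value adaptive restart.

    Illustrative sketch: whenever the objective increases, the momentum
    parameter is reset to 1 and a plain gradient step is taken instead.
    """
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    f_prev = f(x)
    for _ in range(iters):
        x_new = y - step * grad(y)          # accelerated step from extrapolated point y
        if f(x_new) > f_prev:               # adaptive restart: momentum is hurting
            t = 1.0
            x_new = x - step * grad(x)      # fall back to a plain gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum extrapolation
        x, t, f_prev = x_new, t_new, f(x_new)
    return x

# Toy ill-conditioned quadratic f(x) = 0.5 x^T A x, minimized at the origin.
A = np.diag([100.0, 1.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_star = nesterov_with_restart(f, grad, np.array([1.0, 1.0]), step=1.0 / 100.0)
```

The restart test used here is the simple function-value condition; gradient-based restart conditions are an equally common variant.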
Cites Work
- Low-rank retractions: a survey and new results
- Adaptive restart of the optimized gradient method for convex optimization
- A direct proof and a generalization for a Kantorovich type inequality
- On the asymptotic linear convergence speed of Anderson acceleration applied to ADMM
- Fast Cadzow's algorithm and a gradient variant
- On the asymptotic convergence and acceleration of gradient methods
- A stochastic Nesterov's smoothing accelerated method for general nonsmooth constrained stochastic composite convex optimization
- Guarantees of Riemannian optimization for low rank matrix completion
- Matrix recipes for hard thresholding methods
- Accelerated alternating direction method of multipliers: an optimal \(O(1 / K)\) nonergodic analysis
- Linear and nonlinear programming
- Adaptive restart for accelerated gradient schemes
- Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
- Accelerated additive Schwarz methods for convex optimization with adaptive restart
- Guarantees of Riemannian Optimization for Low Rank Matrix Recovery
- Low-Rank Matrix Completion by Riemannian Optimization
- Normalized Iterative Hard Thresholding for Matrix Completion
- Projection-like Retractions on Matrix Manifolds
- Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- A variational perspective on accelerated methods in optimization
- Accelerated Optimization for Machine Learning
- Spectral Methods for Data Science: A Statistical Perspective
- A Second Order Primal-Dual Method for Nonsmooth Convex Composite Optimization
- An Introduction to Optimization on Smooth Manifolds
- An extension of fast iterative shrinkage‐thresholding algorithm to Riemannian optimization for sparse principal component analysis
- Improving “Fast Iterative Shrinkage-Thresholding Algorithm”: Faster, Smarter, and Greedier
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- Some methods of speeding up the convergence of iteration methods
- A Variational Formulation of Accelerated Optimization on Riemannian Manifolds
- On the steepest descent algorithm for quadratic functions