Analytical convergence regions of accelerated gradient descent in nonconvex optimization under regularity condition
From MaRDI portal
Publication: 2173914
DOI: 10.1016/j.automatica.2019.108715
zbMath: 1440.93071
arXiv: 1810.03229
OpenAlex: W2997973415
Wikidata: Q126466333
Scholia: Q126466333
MaRDI QID: Q2173914
Huaqing Xiong, Bin Hu, Wei Zhang, Yuejie Chi
Publication date: 17 April 2020
Published in: Automatica
Full work available at URL: https://arxiv.org/abs/1810.03229
Related Items (3)
- A frequency-domain analysis of inexact gradient methods
- A zeroing neural dynamics based acceleration optimization approach for optimizers in deep neural networks
- Convergence of gradient algorithms for nonconvex \(C^{1+\alpha}\) cost functions
Cites Work
- On the Kalman-Yakubovich-Popov lemma
- Introductory lectures on convex optimization. A basic course.
- Analysis of biased stochastic gradient descent using sequential semidefinite programs
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- Solving Systems of Random Quadratic Equations via Truncated Amplitude Flow
- On Fienup Methods for Sparse Phase Retrieval
- The Role of Convexity in Saddle-Point Dynamics: Lyapunov Function and Robustness
- Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems
- Non-convex low-rank matrix recovery with arbitrary outliers via median-truncated gradient descent
- Continuous-Time Accelerated Methods via a Hybrid Control Lens
- The Proximal Augmented Lagrangian Method for Nonsmooth Composite Optimization
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- Matrix Completion From a Few Entries
- Some methods of speeding up the convergence of iteration methods
- Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions