Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
DOI: 10.1051/cocv/2017083 · zbMath: 1437.49045 · arXiv: 1706.05671 · OpenAlex: W2963022670 · MaRDI QID: Q5107904
Hassan Riahi, Zaki Chbani, Hedy Attouch
Publication date: 29 April 2020
Published in: ESAIM: Control, Optimisation and Calculus of Variations
Full work available at URL: https://arxiv.org/abs/1706.05671
Keywords: subcritical case; structured convex optimization; Nesterov method; accelerated gradient method; FISTA; inertial forward-backward algorithms; vanishing damping; proximal-based methods
MSC: Numerical mathematical programming methods (65K05); Convex programming (90C25); Numerical methods based on nonlinear programming (49M37)
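The paper concerns the inertial gradient scheme with vanishing damping parameter α, i.e. the iteration y_k = x_k + (k−1)/(k+α−1)(x_k − x_{k−1}), x_{k+1} = y_k − s∇f(y_k); for α > 3 one has the accelerated o(1/k²) rate, while the subcritical range α ≤ 3 studied here yields the slower rate O(1/k^{2α/3}). A minimal sketch of this iteration, with hypothetical test data (the matrix `A`, vector `b`, and function names are illustrative, not from the paper):

```python
import numpy as np

def inertial_gradient(grad, x0, step, alpha, iters):
    """Nesterov-type inertial gradient iteration with damping parameter alpha:

        y_k     = x_k + (k-1)/(k+alpha-1) * (x_k - x_{k-1})
        x_{k+1} = y_k - step * grad(y_k)

    alpha = 3 recovers the classical Nesterov/FISTA coefficient (k-1)/(k+2);
    the paper analyzes the subcritical case alpha <= 3.
    """
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, iters + 1):
        y = x + (k - 1) / (k + alpha - 1) * (x - x_prev)
        x_prev, x = x, y - step * grad(y)
    return x

# Illustrative data: minimize f(x) = 0.5 * ||A x - b||^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda z: A.T @ (A @ z - b)
L = np.linalg.norm(A.T @ A, 2)      # Lipschitz constant of grad f
x_star = np.linalg.solve(A, b)      # exact minimizer (A is invertible)
x = inertial_gradient(grad, np.zeros(2), 1.0 / L, alpha=3.0, iters=500)
```

With step size 1/L and α = 3 the standard O(1/k²) bound on f(x_k) − f* applies, so 500 iterations bring the iterate close to `x_star` on this small well-conditioned problem.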
Related Items (56)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Optimized first-order methods for smooth convex minimization
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Introductory lectures on convex optimization. A basic course.
- Asymptotic stabilization of inertial gradient dynamics with time-dependent viscosity
- Performance of first-order methods for smooth convex minimization: a novel approach
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- On damped second-order gradient systems
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- A Unified Framework for Some Inexact Proximal Point Algorithms
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- Accelerated and Inexact Forward-Backward Algorithms
- Convex Optimization in Normed Spaces
- Evolution equations for maximal monotone operators: asymptotic analysis in continuous and discrete time
- Stability of Over-Relaxations for the Forward-Backward Algorithm, Application to FISTA
- On the Long Time Behavior of Second Order Differential Equations with Asymptotically Small Dissipation
- New Proximal Point Algorithms for Convex Minimization
- On the Minimizing Property of a Second Order Dissipative System in Hilbert Spaces
- Optimisation and asymptotic stability
- Convergence Rates of Inertial Forward-Backward Algorithms
- Asymptotics for a second-order differential equation with nonautonomous damping and an integrable source term
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
- Convex analysis and monotone operator theory in Hilbert spaces