On the convergence analysis of the optimized gradient method
DOI: 10.1007/s10957-016-1018-7
zbMATH: 1360.90200
DBLP: journals/jota/KimF17
arXiv: 1510.08573
OpenAlex: W2953231121
Wikidata: Q47864275 (Scholia: Q47864275)
MaRDI QID: Q511969
Authors: Jeffrey A. Fessler, Donghwan Kim
Publication date: 23 February 2017
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/1510.08573
Keywords: worst-case performance analysis; convergence bound; first-order algorithms; smooth convex minimization; optimized gradient method
MSC: Analysis of algorithms and problem complexity (68Q25); Semidefinite programming (90C22); Convex programming (90C25); Abstract computational complexity for mathematical programming problems (90C60); Nonlinear programming (90C30); Discrete approximations in optimal control (49M25)
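The publication analyzes the optimized gradient method (OGM) of Kim and Fessler, introduced in "Optimized first-order methods for smooth convex minimization" (cited below). As a point of reference, a minimal Python sketch of the OGM iteration for an \(L\)-smooth convex function follows; the names `grad_f`, `L`, `x0`, and `N` are illustrative assumptions, not part of the record:

```python
import math

def ogm(grad_f, L, x0, N):
    """Minimal sketch of the optimized gradient method (OGM).

    grad_f: gradient oracle for an L-smooth convex f (assumed name)
    L:      Lipschitz constant of grad_f
    x0:     starting point (scalar or numpy array)
    N:      number of iterations
    """
    x, y = x0, x0
    theta = 1.0
    for k in range(N):
        y_new = x - grad_f(x) / L  # plain gradient step
        if k < N - 1:
            theta_new = (1 + math.sqrt(1 + 4 * theta**2)) / 2
        else:
            # modified coefficient on the final iteration
            theta_new = (1 + math.sqrt(1 + 8 * theta**2)) / 2
        # two momentum terms, unlike Nesterov's fast gradient method
        x = y_new + ((theta - 1) / theta_new) * (y_new - y) \
                  + (theta / theta_new) * (y_new - x)
        y, theta = y_new, theta_new
    return x
```

The modified final coefficient \(\theta_N\) gives OGM its worst-case guarantee \(f(x_N) - f(x_\star) \le L\lVert x_0 - x_\star\rVert^2 / (2\theta_N^2)\), roughly a factor of two smaller than the corresponding bound for Nesterov's fast gradient method; the present paper studies the convergence behavior of both OGM sequences, including the secondary sequence \(\{y_k\}\).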
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Optimized first-order methods for smooth convex minimization
- An optimal variant of Kelley's cutting-plane method
- Gradient methods for minimizing composite functions
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- The exact information-based complexity of smooth convex minimization
- Introductory lectures on convex optimization. A basic course.
- Performance of first-order methods for smooth convex minimization: a novel approach
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
- Some methods of speeding up the convergence of iteration methods