On the quality of first-order approximation of functions with Hölder continuous gradient
Publication: 1985266
DOI: 10.1007/s10957-020-01632-x
zbMath: 1436.90154
arXiv: 2001.07946
OpenAlex: W3006648311
MaRDI QID: Q1985266
Raphaël M. Jungers, Yu. E. Nesterov, Guillaume O. Berger, Pierre-Antoine Absil
Publication date: 7 April 2020
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/2001.07946
MSC classification:
- Approximation methods and heuristics in mathematical programming (90C59)
- Programming in abstract spaces (90C48)
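The paper concerns the quality of the first-order (descent-lemma) bound for functions whose gradient is Hölder continuous: if ‖∇f(x) − ∇f(y)‖ ≤ L‖x − y‖^ν, then |f(y) − f(x) − ⟨∇f(x), y − x⟩| ≤ L/(1+ν) · ‖y − x‖^{1+ν}. As a rough illustration (not taken from the paper itself), the sketch below numerically checks this bound on the one-dimensional function f(x) = |x|^{1+ν}; the function, the safe constant L = 2(1+ν), and the helper name `check_holder_bound` are illustrative choices, not the authors' construction.

```python
import math
import random

def check_holder_bound(nu=0.5, trials=10000, seed=0):
    """Numerically check the first-order bound
        |f(y) - f(x) - f'(x)(y - x)| <= L/(1+nu) * |y - x|^(1+nu)
    for f(x) = |x|^(1+nu), whose derivative is nu-Hoelder continuous
    with constant L = 2*(1+nu) (a safe, not necessarily tight, choice).
    Returns the largest violation observed; it should be <= 0."""
    rng = random.Random(seed)
    f = lambda x: abs(x) ** (1 + nu)
    # f'(x) = (1+nu) * sign(x) * |x|^nu; copysign handles the sign, and
    # the expression evaluates to 0 at x = 0 as expected.
    df = lambda x: (1 + nu) * math.copysign(abs(x) ** nu, x)
    L = 2 * (1 + nu)
    worst = float("-inf")
    for _ in range(trials):
        x = rng.uniform(-5.0, 5.0)
        y = rng.uniform(-5.0, 5.0)
        lhs = abs(f(y) - f(x) - df(x) * (y - x))
        rhs = (L / (1 + nu)) * abs(y - x) ** (1 + nu)
        worst = max(worst, lhs - rhs)
    return worst
```

Running `check_holder_bound()` for any ν in (0, 1] should return a non-positive value, consistent with the classical bound; the paper studies how tight such first-order approximations can be made.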
Cites Work
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
- Gradient methods for minimizing composite functions
- Universal gradient methods for convex optimization problems
- Ellipsoids of maximal volume in convex bodies
- Characterizations of inner product spaces
- Lectures on Modern Convex Optimization
- Worst-case evaluation complexity of regularization methods for smooth unconstrained optimization using Hölder continuous gradients
- Global rates of convergence for nonconvex optimization on manifolds
- Traces and Emergence of Nonlinear Programming
- Maxima for Graphs and a New Proof of a Theorem of Turán