General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
Publication: 2696991
DOI: 10.1007/s10957-023-02178-4
OpenAlex: W3155616566
MaRDI QID: Q2696991
Publication date: 17 April 2023
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/2104.10196
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Nonlinear programming (90C30)
Related Items (2)
- Revisiting Spectral Bundle Methods: Primal-Dual (Sub)linear Convergence Rates
- On optimal universal first-order methods for minimizing heterogeneous sums
Cites Work
- Primal-dual subgradient methods for convex problems
- Universal gradient methods for convex optimization problems
- New fractional error bounds for polynomial systems with applications to Hölderian stability in optimization and spectral theory of tensors
- Finite termination of the proximal point algorithm
- On gradients of functions definable in o-minimal structures
- On semi- and subanalytic geometry
- Introductory lectures on convex optimization. A basic course.
- Efficiency of proximal bundle methods
- From error bounds to the complexity of first-order descent methods for convex functions
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Proximal level bundle methods for convex nondifferentiable optimization, saddle-point problems and variational inequalities
- New variants of bundle methods
- Restarting Frank-Wolfe: faster rates under Hölderian error bounds
- Rate of convergence of the bundle method
- Linear convergence of first order methods for non-strongly convex optimization
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- A simple nearly optimal restart scheme for speeding up first-order methods
- Weak Sharp Minima in Mathematical Programming
- Optimal methods of smooth convex minimization
- Monotone Operators and the Proximal Point Algorithm
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- Sharpness, Restart, and Acceleration
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Minimization of unsmooth functionals