Convergence Rates for Deterministic and Stochastic Subgradient Methods without Lipschitz Continuity
Publication: 5231668
DOI: 10.1137/18M117306X
zbMath: 1421.90115
arXiv: 1712.04104
OpenAlex: W2963531627
Wikidata: Q127856914
Scholia: Q127856914
MaRDI QID: Q5231668
Publication date: 27 August 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1712.04104
Mathematics Subject Classification:
- Convex programming (90C25)
- Methods of reduced gradient type (90C52)
- Numerical methods for variational inequalities and related problems (65K15)
Related Items (6)
- Unified analysis of stochastic gradient methods for composite convex and smooth optimization
- Stochastic approximation with discontinuous dynamics, differential inclusions, and applications
- On optimal universal first-order methods for minimizing heterogeneous sums
- Convergence Rates of Proximal Gradient Methods via the Convex Conjugate
- A unified convergence analysis of stochastic Bregman proximal gradient and extragradient methods
- Cyclic coordinate descent in the Hölder smooth setting
Cites Work
- Pegasos: primal estimated sub-gradient solver for SVM
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Nondifferentiable optimization and polynomial problems
- Introductory lectures on convex optimization. A basic course.
- From error bounds to the complexity of first-order descent methods for convex functions
- Linear convergence of first order methods for non-strongly convex optimization
- A Quasi-Newton Approach to Nonsmooth Convex Optimization Problems in Machine Learning
- An optimal algorithm for stochastic strongly-convex optimization
- Weak Sharp Minima in Mathematical Programming
- Optimal methods of smooth convex minimization
- Radial Subgradient Method
- On Stochastic Subgradient Mirror-Descent Algorithm with Weighted Averaging
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- "Efficient" Subgradient Methods for General Convex Optimization
- A Stochastic Approximation Method