Perturbed iterate SGD for Lipschitz continuous loss functions
Publication: 2093279
DOI: 10.1007/s10957-022-02093-0
OpenAlex: W3012552999
MaRDI QID: Q2093279
Akiko Takeda, Michael R. Metel
Publication date: 7 November 2022
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/2003.07606
Mathematics Subject Classification:
- Analysis of algorithms and problem complexity (68Q25)
- Nonconvex programming, global optimization (90C26)
- Stochastic programming (90C15)
- Stochastic approximation (62L20)
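The paper's subject, perturbed iterate SGD, evaluates stochastic gradients at randomly perturbed copies of the current iterate, which amounts to a randomized smoothing of a Lipschitz continuous (possibly nonsmooth, nonconvex) loss. Below is a minimal sketch of this general idea, assuming a uniform-ball perturbation, a constant step size, and a user-supplied stochastic gradient oracle; the names `grad_oracle`, `step`, and `radius` are illustrative, and this is not a reproduction of the paper's exact algorithm or parameter schedules.

```python
import numpy as np

def sample_uniform_ball(d, radius, rng):
    """Sample a point uniformly from the d-dimensional ball of given radius."""
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)
    r = radius * rng.random() ** (1.0 / d)  # radial CDF inversion
    return r * v

def perturbed_iterate_sgd(grad_oracle, x0, data, step=1e-2, radius=1e-3,
                          iters=1000, seed=0):
    """Generic perturbed-iterate SGD sketch: the stochastic gradient is
    taken at a randomly perturbed point x + u rather than at x itself,
    acting as a randomized smoothing of a Lipschitz loss."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        sample = data[rng.integers(len(data))]   # draw one training sample
        u = sample_uniform_ball(x.size, radius, rng)
        g = grad_oracle(x + u, sample)           # stochastic gradient at x + u
        x -= step * g                            # standard SGD step
    return x
```

In schemes of this kind, the perturbation radius trades off smoothing bias against fidelity to the original loss; the step-size and radius choices that carry the convergence guarantees should be taken from the paper itself.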
Cites Work
- On stochastic gradient and subgradient methods with adaptive steplength sequences
- Introductory lectures on convex optimization. A basic course.
- Stochastic calculus for finance. II: Continuous-time models.
- A neural network model with bounded-weights for pattern classification
- Geometric measure theory.
- Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
- Convergence of constant step stochastic gradient descent for non-smooth non-convex functions
- Lower bounds for finding stationary points I
- Stochastic subgradient method converges on tame functions
- Random gradient-free minimization of convex functions
- Decentralized Resource Allocation in Dynamic Networks of Agents
- Optimization of Lipschitz continuous functions
- Variational Analysis
- Stochastic Model-Based Minimization of Weakly Convex Functions
- Decomposition Methods for Computing Directional Stationary Solutions of a Class of Nonsmooth Nonconvex Optimization Problems
- The Minimization of Semicontinuous Functions: Mollifier Subgradients
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Understanding Machine Learning
- Approximating Subdifferentials by Random Sampling of Gradients