Lyapunov stability of the subgradient method with constant step size
Publication: 6052064
DOI: 10.1007/s10107-023-01936-6
arXiv: 2211.14850
OpenAlex: W4321111845
MaRDI QID: Q6052064
Publication date: 23 October 2023
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/2211.14850
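As context for the record above: the publication concerns the subgradient method with a constant step size. A minimal sketch of that iteration, x_{k+1} = x_k - α g_k with g_k a subgradient of f at x_k, applied here to the illustrative nonsmooth convex function f(x) = ‖x‖₁ (the example function, step size, and iteration count are assumptions, not taken from the paper):

```python
import numpy as np

def f(x):
    """Nonsmooth convex test function f(x) = ||x||_1, minimized at the origin."""
    return np.abs(x).sum()

def subgrad_f(x):
    """np.sign(x) selects a valid subgradient of the l1-norm at x."""
    return np.sign(x)

def subgradient_method(x0, alpha, num_iters):
    """Run x_{k+1} = x_k - alpha * g_k with constant step size alpha."""
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        x = x - alpha * subgrad_f(x)
    return x

# Illustrative run: with a constant step size, the iterates do not converge
# exactly in general but settle within a neighborhood of size O(alpha)
# around the minimizer.
x_final = subgradient_method([2.0, -1.5], alpha=0.01, num_iters=1000)
print(f(x_final))
```

With a constant step size the iterates typically oscillate near the solution rather than converging, which is why the Lyapunov-style stability analysis in the title is of interest for this regime.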
Related Items (1)
Cites Work
- On gradients of functions definable in o-minimal structures
- Subgradient methods for sharp weakly convex functions
- Geometric categories and o-minimal structures
- Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
- Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
- Convergence of constant step stochastic gradient descent for non-smooth non-convex functions
- Stochastic subgradient method converges on tame functions
- On the division of distributions by polynomials
- On the stable equilibrium points of gradient systems
- Curves of Descent
- Genericity in Polynomial Optimization
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Clarke Subgradients of Stratifiable Functions
- Generic Differentiability of Lipschitzian Functions
- Constant step stochastic approximations involving differential inclusions: stability, long-run convergence and applications
- Stochastic Approximations and Differential Inclusions
- Convergence of the Iterates of Descent Methods for Analytic Cost Functions
- Sur le problème de la division