Subgradient Sampling for Nonsmooth Nonconvex Minimization
Publication: 6076858
DOI: 10.1137/22m1479178
arXiv: 2202.13744
OpenAlex: W4281661032
MaRDI QID: Q6076858
Tam Le, Jérôme Bolte, Edouard Pauwels
Publication date: 17 October 2023
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2202.13744
Keywords: stochastic gradient; path-differentiability; conservative gradient; online deep learning; subgradient sampling
Mathematics Subject Classification:
- Analysis of algorithms and problem complexity (68Q25)
- Graph theory (including graph drawing) in computer science (68R10)
- Computer graphics; computational geometry (digital and algorithmic aspects) (68U05)
Cites Work
- Stability under integration of sums of products of real globally subanalytic functions and their logarithms
- Topological aspects of nonsmooth optimization
- Uniform laws of large numbers for set-valued mappings and subdifferentials of random functions
- Stochastic generalized gradient method for nonconvex nonsmooth stochastic optimization
- Ergodic properties of weak asymptotic pseudotrajectories for semiflows
- Stochastic heavy ball
- Correction to: "On the real exponential field with restricted analytic functions"
- Geometric categories and o-minimal structures
- Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
- Stochastic generalized gradient methods for training nonconvex nonsmooth neural networks
- Convergence of constant step stochastic gradient descent for non-smooth non-convex functions
- Convergence of a stochastic subgradient method with averaging for nonsmooth nonconvex constrained optimization
- Stochastic subgradient method converges on tame functions
- Conservative and semismooth derivatives are equivalent for semialgebraic maps
- Path differentiability of ODE flows
- Stochastic proximal subgradient descent oscillates in the vicinity of its accumulation set
- Stochastic generalized-differentiable functions in the problem of nonconvex nonsmooth stochastic optimization
- Clarke Subgradients of Stratifiable Functions
- Optimization Methods for Large-Scale Machine Learning
- Ergodic properties of weak asymptotic pseudotrajectories for set-valued dynamical systems
- A Stochastic Subgradient Method for Nonsmooth Nonconvex Multilevel Composition Optimization
- An Inertial Newton Algorithm for Deep Learning
- Stochastic Approximations and Differential Inclusions
- Learning representations by back-propagating errors
- A General Chain Rule for Derivatives and the Change of Variables Formula for the Lebesgue Integral
- Probability
- Conservative parametric optimality and the ridge method for tame min-max problems