Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
DOI: 10.1007/s10107-020-01501-5 · zbMath: 1471.65057 · arXiv: 1909.10300 · OpenAlex: W3016321495 · MaRDI QID: Q2039229
Publication date: 2 July 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1909.10300
Keywords: automatic differentiation; Clarke subdifferential; backpropagation algorithm; o-minimal structures; stochastic gradient; deep learning; definable sets; first order methods; nonsmooth stochastic optimization
MSC classifications: Large-scale problems in mathematical programming (90C06); Numerical optimization and variational techniques (65K10); Learning and adaptive systems in artificial intelligence (68T05); Set-valued and variational analysis (49J53); Decomposition methods (49M27); Neural nets and related approaches to inference from stochastic processes (62M45)
Related Items (20)
Uses Software
Cites Work
- On Lipschitz optimization based on gray-box piecewise linearization
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Tangentially continuous directional derivatives in nonsmooth analysis
- On gradients of functions definable in o-minimal structures
- Essentially smooth Lipschitz functions
- Integration of subdifferentials of lower semicontinuous functions on Banach spaces
- Geometric categories and o-minimal structures
- Stochastic subgradient method converges on tame functions
- On the maximal monotonicity of subdifferential mappings
- Generalized subdifferentials: a Baire categorical approach
- On stable piecewise linearization and generalized algorithmic differentiation
- Clarke Subgradients of Stratifiable Functions
- Evaluating Derivatives
- Optimization and nonsmooth analysis
- Nonsmooth Analysis: Differential Calculus of Nondifferentiable Mappings
- Analysis of recursive stochastic algorithms
- Variational Analysis
- A Chain Rule for Essentially Smooth Lipschitz Functions
- The heavy ball with friction method. I: The continuous dynamical system: global exploration of the local minima of a real-valued function by asymptotic analysis of a dissipative dynamical system
- Optimization Methods for Large-Scale Machine Learning
- Variational Analysis of Regular Mappings
- Constant step stochastic approximations involving differential inclusions: stability, long-run convergence and applications
- Convergence and Dynamical Behavior of the ADAM Algorithm for Nonconvex Stochastic Optimization
- Stochastic Approximations and Differential Inclusions
- Learning representations by back-propagating errors
- Integrability of subdifferentials of directionally Lipschitz functions
- A Stochastic Approximation Method
- Proof of the gradient conjecture of R. Thom.