Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning

DOI: 10.1007/s10107-020-01501-5
zbMath: 1471.65057
arXiv: 1909.10300
OpenAlex: W3016321495
MaRDI QID: Q2039229

Jérôme Bolte, Edouard Pauwels

Publication date: 2 July 2021

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://arxiv.org/abs/1909.10300

Related Items (20)

Convergence of constant step stochastic gradient descent for non-smooth non-convex functions
Convergence of a Piggyback-Style Method for the Differentiation of Solutions of Standard Saddle-Point Problems
Modeling design and control problems involving neural network surrogates
Global convergence of the gradient method for functions definable in o-minimal structures
Lyapunov stability of the subgradient method with constant step size
Subgradient Sampling for Nonsmooth Nonconvex Minimization
An Improved Unconstrained Approach for Bilevel Optimization
Stochastic approximation with discontinuous dynamics, differential inclusions, and applications
Certifying the Absence of Spurious Local Minima at Infinity
Sufficient Conditions for Instability of the Subgradient Method with Constant Step Size
Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems
Conservative parametric optimality and the ridge method for tame min-max problems
Densities of almost surely terminating probabilistic programs are differentiable almost everywhere
Incremental without replacement sampling in nonconvex optimization
An Inertial Newton Algorithm for Deep Learning
Convergence analysis for gradient flows in the training of artificial neural networks with ReLU activation
Perturbed iterate SGD for Lipschitz continuous loss functions
The Structure of Conservative Gradient Fields
Learning Maximally Monotone Operators for Image Recovery
Examples of Pathological Dynamics of the Subgradient Method for Lipschitz Path-Differentiable Functions

