scientific article; zbMATH DE number 7255160
From MaRDI portal
Publication:4969223
André Belotto da Silva, Maxime Gazeau
Publication date: 5 October 2020
Full work available at URL: https://arxiv.org/abs/1810.13108
Title: A general system of differential equations to model first-order adaptive algorithms
Keywords: differential equation; adaptive algorithms; first-order methods; convex and non-convex optimization; forward Euler discretization
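As background on the keywords above (not part of the record itself): forward Euler discretization of the gradient-flow ODE x'(t) = -∇f(x(t)) with step size h yields exactly gradient descent, x_{k+1} = x_k - h ∇f(x_k), which is the basic link between continuous-time models and first-order methods that this line of work builds on. A minimal sketch:

```python
import numpy as np

def forward_euler_gradient_flow(grad, x0, h, steps):
    """Forward Euler discretization of the gradient flow x'(t) = -grad_f(x(t)).

    Each Euler step x_{k+1} = x_k - h * grad(x_k) coincides with a
    gradient-descent step of size h.
    """
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x - h * grad(x)
    return x

# Example: f(x) = 0.5 * ||x||^2, so grad f(x) = x; the minimizer is the origin.
x_final = forward_euler_gradient_flow(lambda x: x, x0=[1.0, -2.0], h=0.1, steps=200)
```

Here the iterate contracts toward the minimizer at the geometric rate (1 - h) per step; adaptive algorithms such as Adam correspond to richer ODE systems with momentum and per-coordinate rescaling, discretized in the same way.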
Related Items (3)
- Friction-adaptive descent: a family of dynamics-based optimization methods
- Convergence and Dynamical Behavior of the ADAM Algorithm for Nonconvex Stochastic Optimization
- Stochastic optimization with momentum: convergence, fluctuations, and traps avoidance
Cites Work
- A second-order gradient-like dissipative dynamical system with Hessian-driven damping. Application to optimization and mechanics.
- Introductory lectures on convex optimization. A basic course.
- Stochastic heavy ball
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Long time behaviour and stationary regime of memory gradient diffusions
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Asymptotics for a gradient system with memory term
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- On the long time behavior of second order differential equations with asymptotically small dissipation
- Compound Quadrature Rules for the Product of Two Functions
- On the Minimizing Property of a Second Order Dissipative System in Hilbert Spaces
- Gradient Descent Only Converges to Minimizers: Non-Isolated Critical Points and Invariant Regions
- A variational perspective on accelerated methods in optimization
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3