A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
DOI: 10.1137/22m1498486 · zbMath: 1522.90102 · arXiv: 2205.12772 · OpenAlex: W4281562381 · MaRDI QID: Q6116244
Adrien B. Taylor, Céline Moucer, Francis Bach
Publication date: 11 August 2023
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2205.12772
Keywords: convex optimization; ordinary differential equations; stochastic differential equations; first-order methods; continuous-time models; performance estimation; worst-case analyses
Mathematics Subject Classification: Analysis of algorithms and problem complexity (68Q25); Semidefinite programming (90C22); Convex programming (90C25); Nonlinear programming (90C30)
Cites Work
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Fast convex optimization via inertial dynamics with Hessian driven damping
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- A dynamical system associated with the fixed points set of a nonexpansive operator
- Understanding the acceleration phenomenon via high-resolution differential equations
- Performance of first-order methods for smooth convex minimization: a novel approach
- Linear convergence of first order methods for non-strongly convex optimization
- Second Order Forward-Backward Dynamical Systems For Monotone Inclusion Problems
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- Acceleration of Stochastic Approximation by Averaging
- Applied Stochastic Differential Equations
- A variational perspective on accelerated methods in optimization
- Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems
- The Connections Between Lyapunov Functions for Some Optimization Algorithms and Differential Equations
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
- Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
- Some methods of speeding up the convergence of iteration methods
- Generalized Momentum-Based Methods: A Hamiltonian Perspective
- Numerical Analysis