Parameter-free accelerated gradient descent for nonconvex minimization
Publication: 6561381
DOI: 10.1137/22M1540934
zbMATH Open: 1548.90404
MaRDI QID: Q6561381
Publication date: 25 June 2024
Published in: SIAM Journal on Optimization
Mathematics Subject Classification (MSC): Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- On the ergodic convergence rates of a first-order primal-dual algorithm
- Fast first-order methods for composite convex optimization with backtracking
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Updating the regularization parameter in the adaptive cubic regularization algorithm
- Lectures on convex optimization
- Templates for convex cone problems with applications to sparse signal recovery
- Adaptive regularization with cubics on manifolds
- Lower bounds for finding stationary points I
- A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization
- First-order and stochastic optimization methods for machine learning
- Adaptive restart for accelerated gradient schemes
- Minimization of functions having Lipschitz continuous first partial derivatives
- Variable metric inexact line-search-based methods for nonsmooth optimization
- Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Accelerated Methods for NonConvex Optimization
- First-Order Methods in Optimization
- Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization
- A Limited Memory Algorithm for Bound Constrained Optimization
- Ergodic Mirror Descent
- Finding approximate local minima faster than gradient descent
- Global rates of convergence for nonconvex optimization on manifolds
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- An Average Curvature Accelerated Composite Gradient Method for Nonconvex Smooth Composite Optimization Problems
- Backtracking Strategies for Accelerated Descent Methods with Smooth Composite Objectives
- Convergence Rate Analysis of Several Splitting Schemes
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians