Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient
DOI: 10.1137/22m1500496 · zbMath: 1522.90101 · arXiv: 2206.01209 · OpenAlex: W4386291915 · MaRDI QID: Q6046830
Publication date: 6 September 2023
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2206.01209
Keywords: convex optimization; iteration complexity; proximal augmented Lagrangian method; proximal gradient method; accelerated first-order methods; operation complexity; locally Lipschitz continuous gradient
MSC classification: Convex programming (90C25); Nonlinear programming (90C30); Optimality conditions and duality in mathematical programming (90C46); Numerical methods based on nonlinear programming (49M37)
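The keywords above describe accelerated proximal gradient methods in which the gradient of the smooth term is only locally (not globally) Lipschitz, so step sizes must be estimated adaptively, e.g. by backtracking. The following is a minimal Python sketch of a FISTA-style accelerated proximal gradient iteration with backtracking line search; it illustrates the general mechanism only and is not the algorithm analyzed in the indexed paper. All names (accel_prox_grad, prox_g, and the LASSO example data) are illustrative assumptions, not taken from the source.

```python
import numpy as np

def accel_prox_grad(f, grad_f, prox_g, x0, L0=1.0, eta=2.0,
                    max_iter=500, tol=1e-8):
    """Sketch: minimize f(x) + g(x) with f convex and smooth (gradient
    only locally Lipschitz) and g convex with an easy prox. Backtracking
    estimates a local Lipschitz constant, so no global constant is needed."""
    x, y, t, L = x0.copy(), x0.copy(), 1.0, L0
    for _ in range(max_iter):
        fy, gy = f(y), grad_f(y)
        # Backtracking: grow L until the quadratic upper model holds at
        # x_new, certifying L as a Lipschitz estimate valid near y.
        while True:
            x_new = prox_g(y - gy / L, 1.0 / L)
            d = x_new - y
            if f(x_new) <= fy + gy @ d + 0.5 * L * (d @ d) + 1e-12:
                break
            L *= eta
        # Nesterov/FISTA momentum extrapolation.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x_new)):
            return x_new
        x, t = x_new, t_new
    return x

# Illustrative usage: LASSO, f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1,
# whose prox is soft-thresholding.
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
soft = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - s * lam, 0.0)
x_hat = accel_prox_grad(lambda x: 0.5 * np.sum((A @ x - b) ** 2),
                        lambda x: A.T @ (A @ x - b), soft,
                        np.zeros(100))
```

Because only a local Lipschitz constant is estimated, the backtracking loop may keep increasing L across iterations; the iteration and operation complexity of accelerated methods under exactly this local assumption is the subject of the indexed paper.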
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization
- Iteration-complexity of first-order penalty methods for convex programming
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- Complexity of Variants of Tseng's Modified F-B Splitting and Korpelevich's Methods for Hemivariational Inequalities with Applications to Saddle-point and Convex Optimization Problems
- Accelerated, Parallel, and Proximal Coordinate Descent
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Monotone Operators and the Proximal Point Algorithm
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Complexity of first-order inexact Lagrangian and penalty methods for conic convex programming
- A Modified Forward-Backward Splitting Method for Maximal Monotone Mappings
- A Forward-Backward Splitting Method for Monotone Inclusions Without Cocoercivity
- On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- Iteration-complexity of first-order augmented Lagrangian methods for convex programming
- Iteration-Complexity of First-Order Augmented Lagrangian Methods for Convex Conic Programming