Complexity-optimal and parameter-free first-order methods for finding stationary points of composite optimization problems
DOI: 10.1137/22m1498826
MaRDI QID: Q6601206
Publication date: 10 September 2024
Published in: SIAM Journal on Optimization
Keywords: iteration complexity; adaptive; inexact proximal point method; parameter-free; optimal complexity; first-order accelerated gradient method; nonconvex composite optimization
MSC classifications: Convex programming (90C25); Abstract computational complexity for mathematical programming problems (90C60); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30); Numerical optimization and variational techniques (65K10); Variational and other types of inclusions (47J22)
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- An adaptive accelerated first-order method for convex optimization
- OSGA: a fast subgradient algorithm with optimal complexity
- Gradient methods for minimizing composite functions
- Universal gradient methods for convex optimization problems
- Lectures on convex optimization
- Introductory lectures on convex optimization. A basic course.
- Solving structured nonsmooth convex optimization with complexity \(\mathcal{O}(\varepsilon^{-1/2})\)
- A FISTA-type accelerated gradient algorithm for solving smooth nonconvex composite optimization problems
- An efficient adaptive accelerated inexact proximal point method for solving linearly constrained nonconvex composite problems
- Lower bounds for finding stationary points II: first-order methods
- Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems
- Generalized uniformly optimal methods for nonlinear programming
- Efficiency of minimizing compositions of convex functions and smooth maps
- A Redistributed Proximal Bundle Method for Nonconvex Optimization
- On the Complexity of the Hybrid Proximal Extragradient Method for the Iterates and the Ergodic Mean
- A Hybrid Proximal Extragradient Self-Concordant Primal Barrier Method for Monotone Variational Inequalities
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Accelerated Methods for NonConvex Optimization
- First-Order Methods in Optimization
- Stochastic Model-Based Minimization of Weakly Convex Functions
- An Accelerated Composite Gradient Method for Large-Scale Composite Objective Problems
- Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
- An Accelerated Inexact Proximal Point Method for Solving Nonconvex-Concave Min-Max Problems
- Iteration-complexity of a Rockafellar's proximal method of multipliers for convex programming based on second-order approximations
- Complexity of a Quadratic Penalty Accelerated Inexact Proximal Point Method for Solving Linearly Constrained Nonconvex Composite Programs
- Regularized HPE-Type Methods for Solving Monotone Inclusions with Improved Pointwise Iteration-Complexity Bounds
- Iteration Complexity of an Inner Accelerated Inexact Proximal Augmented Lagrangian Method Based on the Classical Lagrangian Function
- Convex analysis and monotone operator theory in Hilbert spaces
- Iteration Complexity of a Proximal Augmented Lagrangian Method for Solving Nonconvex Composite Optimization Problems with Nonlinear Convex Constraints