First-Order Methods for Problems with $O(1)$ Functional Constraints Can Have Almost the Same Convergence Rate as for Unconstrained Problems
Publication: 5097011
DOI: 10.1137/20M1371579
MaRDI QID: Q5097011
Publication date: 19 August 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2010.02282
Numerical mathematical programming methods (65K05); Abstract computational complexity for mathematical programming problems (90C60); Nonlinear programming (90C30)
Related Items (2)
- Reducing the Complexity of Two Classes of Optimization Problems by Inexact Accelerated Proximal Gradient Method
- Decentralized Gradient Descent Maximization Method for Composite Nonconvex Strongly-Concave Minimax Problems
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- Subgradient methods for saddle-point problems
- Introductory lectures on convex optimization: a basic course
- Accelerated schemes for a class of variational inequalities
- Level-set methods for convex optimization
- A new algorithm for minimizing convex functions over convex sets
- A first-order primal-dual algorithm for convex problems with applications to imaging
- An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization
- Complexity of an inexact proximal-point penalty method for constrained smooth non-convex optimization
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- Iteration complexity of inexact augmented Lagrangian methods for constrained convex programming
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
- Stochastic first-order methods for convex and nonconvex functional constrained optimization
- On the Implementation and Usage of SDPT3 – A Matlab Software Package for Semidefinite-Quadratic-Linear Programming, Version 4.0
- A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
- Rate Analysis of Inexact Dual First-Order Methods: Application to Dual Decomposition
- On the Complexity of the Hybrid Proximal Extragradient Method for the Iterates and the Ergodic Mean
- On the Evaluation Complexity of Composite Function Minimization with Applications to Nonconvex Nonlinear Programming
- Approximate Primal Solutions and Rate Analysis for Dual Subgradient Methods
- On Vaidya's Volumetric Cutting Plane Method for Convex Programming
- A Level-Set Method for Convex Optimization with a Feasible Solution Path
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- A Primal-Dual Algorithm with Line Search for General Convex-Concave Saddle Point Problems
- Primal-Dual Stochastic Gradient Method for Convex Programs with Many Functional Constraints
- Complexity of a Quadratic Penalty Accelerated Inexact Proximal Point Method for Solving Linearly Constrained Nonconvex Composite Programs
- Neyman-Pearson classification, convexity and stochastic constraints
- Convex Analysis
- Iteration-complexity of first-order augmented Lagrangian methods for convex programming