Iteration complexity of inexact augmented Lagrangian methods for constrained convex programming
DOI: 10.1007/s10107-019-01425-9 · zbMath: 1458.90518 · arXiv: 1711.05812 · OpenAlex: W2969771825 · Wikidata: Q127350704 · Scholia: Q127350704 · MaRDI QID: Q2220658
Publication date: 25 January 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1711.05812
Keywords: first-order method; iteration complexity; global convergence rate; augmented Lagrangian method (ALM); nonlinearly constrained problem
MSC classification: Analysis of algorithms (68W40); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Decomposition methods (49M27)
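For orientation, the augmented Lagrangian method (ALM) named in the keywords takes, in its classical form for a linearly constrained convex program \(\min_x f(x)\) subject to \(Ax = b\), the iteration sketched below. This is only a generic textbook sketch; the specific inexactness criteria, penalty updates, and nonlinear-constraint handling analyzed in the paper may differ.

\[
\mathcal{L}_\beta(x,y) = f(x) + \langle y,\, Ax - b\rangle + \frac{\beta}{2}\,\|Ax - b\|^2,
\]
\[
x^{k+1} \approx \operatorname*{arg\,min}_x\, \mathcal{L}_\beta(x, y^k) \quad \text{(primal subproblem, solved inexactly, e.g. by a first-order method)},
\]
\[
y^{k+1} = y^k + \beta\,(Ax^{k+1} - b) \quad \text{(multiplier update)}.
\]

The iteration-complexity question studied in this line of work is how many inner first-order iterations, summed over all outer multiplier updates, are needed to reach a prescribed primal-dual accuracy.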
Related Items (14)
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Iteration complexity analysis of multi-block ADMM for a family of convex minimization without strong convexity
- Gradient methods for minimizing composite functions
- Accelerated Bregman method for linearly constrained \(\ell _1-\ell _2\) minimization
- On the convergence of the exponential multiplier method for convex programming
- Inexact accelerated augmented Lagrangian methods
- Subgradient methods for saddle-point problems
- Introductory lectures on convex optimization. A basic course.
- Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- Randomized primal-dual proximal block coordinate updates
- On the global and linear convergence of the generalized alternating direction method of multipliers
- Numerical comparison of augmented Lagrangian algorithms for nonconvex problems
- Multiplier and gradient methods
- The multiplier method of Hestenes and Powell applied to convex programming
- A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
- On the $O(1/n)$ Convergence Rate of the Douglas–Rachford Alternating Direction Method
- Rate Analysis of Inexact Dual First-Order Methods: Application to Dual Decomposition
- Approximate Primal Solutions and Rate Analysis for Dual Subgradient Methods
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- New Proximal Point Algorithms for Convex Minimization
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Penalty/Barrier Multiplier Methods for Convex Programming Problems
- A dual approach to solving nonlinear programming problems by unconstrained optimization
- Accelerated First-Order Primal-Dual Proximal Methods for Linearly Constrained Composite Convex Programming
- Computational Complexity of Inexact Gradient Augmented Lagrangian Methods: Application to Constrained MPC
- On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming
- An Accelerated Linearized Alternating Direction Method of Multipliers
- On Alternating Direction Methods of Multipliers: A Historical Perspective
- Iteration-Complexity of Block-Decomposition Algorithms and the Alternating Direction Method of Multipliers
- Nonlinear Programming
- A Simple Parallel Algorithm with an $O(1/t)$ Convergence Rate for General Convex Programs
- Iteration-complexity of first-order augmented Lagrangian methods for convex programming