Faster Lagrangian-Based Methods in Convex Optimization
From MaRDI portal
Publication: 5062120
DOI: 10.1137/20M1375358 · zbMath: 1486.90149 · arXiv: 2010.14314 · OpenAlex: W4214591022 · MaRDI QID: Q5062120
Publication date: 15 March 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2010.14314
Keywords: nonsmooth optimization; Lagrangian multiplier methods; augmented Lagrangian; alternating direction method of multipliers; convex composite minimization; proximal multiplier algorithms; fast nonergodic global rate of convergence
Related Items
- Reducing the Complexity of Two Classes of Optimization Problems by Inexact Accelerated Proximal Gradient Method
- A unified convergence rate analysis of the accelerated smoothed gap reduction algorithm
- New Primal-Dual Algorithms for a Class of Nonsmooth and Nonlinear Convex-Concave Minimax Problems
- Fast augmented Lagrangian method in the convex regime with convergence guarantees for the iterates
- A golden ratio proximal alternating direction method of multipliers for separable convex optimization
- GRPDA revisited: relaxed condition and connection to Chambolle-Pock's primal-dual algorithm
- A primal-dual flow for affine constrained convex optimization
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- On the ergodic convergence rates of a first-order primal-dual algorithm
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- A proximal-based decomposition method for convex minimization problems
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Accelerated alternating direction method of multipliers: an optimal \(O(1 / K)\) nonergodic analysis
- On non-ergodic convergence rate of Douglas-Rachford alternating direction method of multipliers
- Multiplier and gradient methods
- On the $O(1/n)$ Convergence Rate of the Douglas–Rachford Alternating Direction Method
- Lagrangian methods for composite optimization
- On Full Jacobian Decomposition of the Augmented Lagrangian Method for Separable Convex Programming
- Monotone Operators and the Proximal Point Algorithm
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Accelerated Optimization for Machine Learning
- Rate of Convergence Analysis of Decomposition Methods Based on the Proximal Method of Multipliers for Convex Minimization
- Iteration-Complexity of Block-Decomposition Algorithms and the Alternating Direction Method of Multipliers
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- Convex Analysis
- An introduction to continuous optimization for imaging
- Iteration-complexity of first-order augmented Lagrangian methods for convex programming