Separable approximations and decomposition methods for the augmented Lagrangian
From MaRDI portal
Publication: Q2943840
DOI: 10.1080/10556788.2014.966824
zbMath: 1325.90065
arXiv: 1308.6774
OpenAlex: W2035233604
MaRDI QID: Q2943840
Burak Büke, Rachael Tappenden, Peter Richtárik
Publication date: 4 September 2015
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1308.6774
Mathematics Subject Classification:
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Nonlinear programming (90C30)
- Stochastic programming (90C15)
Related Items
- On optimal probabilities in stochastic coordinate descent methods
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
- The \(p\)-Lagrangian relaxation for separable nonconvex MIQCQP problems
- A parallelizable augmented Lagrangian method applied to large-scale non-convex-constrained optimization problems
- Matrix completion under interval uncertainty
- Coordinate descent with arbitrary sampling I: algorithms and complexity
- Coordinate descent with arbitrary sampling II: expected separable overapproximation
- Distributed primal–dual interior-point methods for solving tree-structured coupled convex problems using message-passing
Cites Work
- Block coordinate descent algorithms for large-scale sparse multiclass classification
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Extended monotropic programming and duality
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- A diagonal quadratic approximation method for large scale linear programs
- Decomposition in large system optimization using the method of multipliers
- The use of Hestenes' method of multipliers to resolve dual gaps in engineering system optimization
- Coordinate descent optimization for \(l^{1}\) minimization with application to compressed sensing; a greedy algorithm
- Efficient block-coordinate descent algorithms for the group Lasso
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Multiplier and gradient methods
- The multiplier method of Hestenes and Powell applied to convex programming
- Alternating Direction Method with Gaussian Back Substitution for Separable Convex Programming
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Properties of an augmented Lagrangian for design optimization
- Improving ultimate convergence of an augmented Lagrangian method
- Single-Machine Scheduling Polyhedra with Precedence Constraints
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- An Extension of the DQA Algorithm to Convex Stochastic Programs
- A New Scenario Decomposition Method for Large-Scale Stochastic Optimization
- On Convergence of an Augmented Lagrangian Decomposition Method for Sparse Convex Optimization
- Convergence of a block coordinate descent method for nondifferentiable minimization