On the convergence of an active-set method for ℓ1 minimization
From MaRDI portal
Publication: 2905351
DOI: 10.1080/10556788.2011.591398 · zbMath: 1244.49055 · OpenAlex: W2160632209 · MaRDI QID: Q2905351
Hongchao Zhang, Wotao Yin, Zaiwen Wen, Donald Goldfarb
Publication date: 27 August 2012
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2011.591398
Keywords: global convergence, \(\ell_1\)-regularized problem, subspace optimization, active-set algorithm, FPC-AS, non-monotone line search (NMLS)
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Numerical methods involving duality (49M29)
Related Items
- An active set algorithm for nonlinear optimization with polyhedral constraints
- An active set Newton-CG method for \(\ell_1\) optimization
- Accelerating block coordinate descent methods with identification strategies
- An inexact quasi-Newton algorithm for large-scale \(\ell_1\) optimization with box constraints
- Adaptive projected gradient thresholding methods for constrained \(l_0\) problems
- A preconditioned conjugate gradient method with active set strategy for \(\ell_1\)-regularized least squares
- Gradient-based method with active set strategy for \(\ell_1\) optimization
- Nonmonotone Barzilai-Borwein gradient algorithm for \(\ell_1\)-regularized nonsmooth minimization in compressive sensing
- A variable fixing version of the two-block nonlinear constrained Gauss-Seidel algorithm for \(\ell_1\)-regularized least-squares
- A new generalized shrinkage conjugate gradient method for sparse recovery
- Local and global convergence of a general inertial proximal splitting scheme for minimizing composite functions
- A Fast Active Set Block Coordinate Descent Algorithm for \(\ell_1\)-Regularized Least Squares
- A parallel line search subspace correction method for composite convex optimization
- A Smoothing Active Set Method for Linearly Constrained Non-Lipschitz Nonconvex Optimization
- A second-order method for convex \(\ell_1\)-regularized optimization with active-set prediction
- A fast conjugate gradient algorithm with active set prediction for \(\ell_1\) optimization
- A Proximal Gradient Method for Ensemble Density Functional Theory
- A decomposition method for Lasso problems with zero-sum constraint
- Nonmonotone spectral gradient method for sparse recovery
- Combining line search and trust-region methods for \(\ell_1\)-minimization
Uses Software
Cites Work
- A coordinate gradient descent method for nonsmooth separable minimization
- Algorithms for bound constrained quadratic programming problems
- A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation
- Fixed-Point Continuation for $\ell_1$-Minimization: Methodology and Convergence
- A New Active Set Algorithm for Box Constrained Optimization
- Proximal Thresholding Algorithm for Minimization over Orthonormal Bases
- Projected gradient methods for linearly constrained problems
- Two-Point Step Size Gradient Methods
- On the Identification of Active Constraints
- On the Solution of Large Quadratic Programming Problems with Bound Constraints
- On the Accurate Identification of Active Constraints
- Exposing Constraints
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- On the Convergence of Successive Linear-Quadratic Programming Algorithms