A Fast Active Set Block Coordinate Descent Algorithm for $\ell_1$-Regularized Least Squares
DOI: 10.1137/141000737
zbMath: 1333.65059
arXiv: 1403.1738
OpenAlex: W3099937901
MaRDI QID: Q2796799
Authors: Marianna De Santis, Stefano Lucidi, Francesco Rinaldi
Publication date: 30 March 2016
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1403.1738
Classification:
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
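For reference, the \(\ell_1\)-regularized least-squares problem named in the title is conventionally written as follows; the symbols \(A\), \(b\), and \(\lambda\) are the standard ones and are not taken from this record:
\[
\min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\,\|Ax - b\|_2^2 + \lambda \|x\|_1, \qquad A \in \mathbb{R}^{m \times n},\; b \in \mathbb{R}^m,\; \lambda > 0.
\]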
Related Items
- A two-stage active-set algorithm for bound-constrained optimization
- A flexible coordinate descent method
- Active-Set Identification with Complexity Guarantees of an Almost Cyclic 2-Coordinate Descent Method with Armijo Line Search
- An active set Newton-CG method for \(\ell_1\) optimization
- Accelerating block coordinate descent methods with identification strategies
- A block active set algorithm with spectral choice line search for the symmetric eigenvalue complementarity problem
- Second-order orthant-based methods with enriched Hessian information for sparse \(\ell_1\)-optimization
- An active set Barzilai-Borwein algorithm for \(\ell_0\) regularized optimization
- ``Active-set complexity'' of proximal gradient: how long does it take to find the sparsity pattern?
- A fast conjugate gradient algorithm with active set prediction for \(\ell_1\) optimization
- Minimization over the \(\ell_1\)-ball using an active-set non-monotone projected gradient
- An active-set algorithmic framework for non-convex optimization problems over the simplex
- A decomposition method for Lasso problems with zero-sum constraint
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A second-order method for strongly convex \(\ell_1\)-regularization problems
- Parallel coordinate descent methods for big data optimization
- A family of second-order methods for convex \(\ell_1\)-regularized optimization
- A variable fixing version of the two-block nonlinear constrained Gauss-Seidel algorithm for \(\ell_1\)-regularized least-squares
- A coordinate gradient descent method for \(\ell_1\)-regularized convex minimization
- A coordinate gradient descent method for nonsmooth separable minimization
- A flexible coordinate descent method
- Quadratically and superlinearly convergent algorithms for the solution of inequality constrained minimization problems
- An active set feasible method for large-scale minimization problems with bound constraints
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- On the uniqueness of overcomplete dictionaries, and a practical way to retrieve them
- Accelerated Block-coordinate Relaxation for Regularized Optimization
- On the convergence of an active-set method for \(\ell_1\) minimization
- A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation
- Fixed-Point Continuation for \(\ell_1\)-Minimization: Methodology and Convergence
- A semismooth Newton method for Tikhonov functionals with sparsity constraints
- Decoding by Linear Programming
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- Sparse Reconstruction by Separable Approximation
- Parallel Selective Algorithms for Nonconvex Big Data Optimization
- Hybrid Random/Deterministic Parallel Algorithms for Convex and Nonconvex Big Data Optimization
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Fast Image Recovery Using Variable Splitting and Constrained Optimization
- Signal Recovery by Proximal Forward-Backward Splitting
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Benchmarking optimization software with performance profiles