An Efficient Proximal Block Coordinate Homotopy Method for Large-Scale Sparse Least Squares Problems
Publication: 5216783
DOI: 10.1137/19M1243828
zbMath: 1434.90091
OpenAlex: W3007949207
MaRDI QID: Q5216783
No author found.
Publication date: 20 February 2020
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://doi.org/10.1137/19m1243828
Keywords: decomposition method; homotopy method; Lasso; sparse optimization; \(\ell_{1-2}\)-minimization; highly coherent matrix
Large-scale problems in mathematical programming (90C06); Nonconvex programming, global optimization (90C26)
Related Items
An Iterative Reduction FISTA Algorithm for Large-Scale LASSO ⋮ A reduced half thresholding algorithm
Uses Software
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- The Adaptive Lasso and Its Oracle Properties
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An inexact successive quadratic approximation method for L-1 regularized optimization
- A family of second-order methods for convex \(\ell _1\)-regularized optimization
- Matrix-free interior point method for compressed sensing problems
- Sparsest solutions of underdetermined linear systems via \( \ell _q\)-minimization for \(0<q\leqslant 1\)
- Basic-set algorithm for a generalized linear complementarity problem
- Least angle regression. (With discussion)
- Quantitative robust uncertainty principles and optimally sparse decompositions
- Atomic Decomposition by Basis Pursuit
- A second-order method for convex \(\ell_1\)-regularized optimization with active-set prediction
- A Method for Finding Structured Sparse Solutions to Nonnegative Least Squares Problems with Applications
- A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation
- NESTA: A Fast and Accurate First-Order Method for Sparse Recovery
- Hard Thresholding Pursuit: An Algorithm for Compressive Sensing
- The Split Bregman Method for L1-Regularized Problems
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Decoding by Linear Programming
- Probing the Pareto Frontier for Basis Pursuit Solutions
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sparse Reconstruction by Separable Approximation
- \(L_{1/2}\) Regularization: Convergence of Iterative Half Thresholding Algorithm
- Sparse Recovery of Streaming Signals Using \(\ell_1\)-Homotopy
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- Regression Shrinkage and Selection via The Lasso: A Retrospective
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- A Semismooth Newton Method with Multidimensional Filter Globalization for \(\ell_1\)-Optimization
- Minimization of \(\ell_{1-2}\) for Compressed Sensing
- Compressed sensing