Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems
From MaRDI portal
Publication:2374363
DOI: 10.1007/s10589-016-9853-x
zbMath: 1357.90107
arXiv: 1503.03520
OpenAlex: W2309836794
Wikidata: Q59472409 (Scholia: Q59472409)
MaRDI QID: Q2374363
Jacek Gondzio, Kimon Fountoulakis
Publication date: 15 December 2016
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1503.03520
Keywords: ill-conditioned problems; second-order methods; first-order methods; \(\ell_1\)-regularised least squares; sparse least squares instance generator
Related Items
- A flexible coordinate descent method
- Visualizing proportions and dissimilarities by space-filling maps: a large neighborhood search approach
- A proximal interior point algorithm with applications to image processing
- Visualizing data as objects by DC (difference of convex) optimization
- Optimal solution for novel grey polynomial prediction model
- A fast conjugate gradient algorithm with active set prediction for \(\ell_1\) optimization
- A view of computational models for image segmentation
Uses Software
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A second-order method for strongly convex \(\ell _1\)-regularization problems
- Parallel coordinate descent methods for big data optimization
- An inexact successive quadratic approximation method for L-1 regularized optimization
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Gradient methods for minimizing composite functions
- Matrix-free interior point method
- A coordinate gradient descent method for nonsmooth separable minimization
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Introductory lectures on convex optimization. A basic course.
- Templates for convex cone problems with applications to sparse signal recovery
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Coordinate descent algorithms for lasso penalized regression
- Accelerated Block-coordinate Relaxation for Regularized Optimization
- On the $O(1/n)$ Convergence Rate of the Douglas–Rachford Alternating Direction Method
- Optimization with Sparsity-Inducing Penalties
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- NESTA: A Fast and Accurate First-Order Method for Sparse Recovery
- A New Alternating Minimization Algorithm for Total Variation Image Reconstruction
- Fixed-Point Continuation for $\ell_1$-Minimization: Methodology and Convergence
- Deblurring Images
- Nonlinear wavelet image processing: variational problems, compression, and noise removal through wavelet shrinkage
- Constructing Test Instances for Basis Pursuit Denoising
- Fast Alternating Direction Optimization Methods
- Compressed sensing
- Convergence of a block coordinate descent method for nondifferentiable minimization