FaRSA for ℓ1-regularized convex optimization: local convergence and numerical experience
From MaRDI portal
Publication: 4638928
DOI: 10.1080/10556788.2017.1415336 · zbMath: 1390.49040 · OpenAlex: W2794053546 · MaRDI QID: Q4638928
Tianyi Chen, Daniel P. Robinson, Frank E. Curtis
Publication date: 2 May 2018
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2017.1415336
convex optimization; nonlinear optimization; sparse optimization; active-set methods; subspace minimization; model prediction; reduced-space methods
Numerical mathematical programming methods (65K05); Convex programming (90C25); Nonlinear programming (90C30); Nonsmooth analysis (49J52); Numerical methods based on nonlinear programming (49M37)
Related Items
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
- On the stationarity for nonlinear optimization problems with polyhedral constraints
- A subspace-accelerated split Bregman method for sparse data recovery with joint \(\ell_1\)-type regularizers
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An inexact successive quadratic approximation method for L-1 regularized optimization
- A family of second-order methods for convex \(\ell _1\)-regularized optimization
- Practical inexact proximal quasi-Newton method with global complexity analysis
- A stabilized SQP method: superlinear convergence
- A second-order method for convex \(\ell_1\)-regularized optimization with active-set prediction
- A New Active Set Algorithm for Box Constrained Optimization
- Inexact Newton Methods
- Sparse Reconstruction by Separable Approximation
- A stabilized SQP method: global convergence
- De-noising by soft-thresholding
- A Reduced-Space Algorithm for Minimizing $\ell_1$-Regularized Convex Functions