A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
DOI: 10.1137/21M1411111 · zbMath: 1491.90157 · arXiv: 2007.14951 · OpenAlex: W3045723702 · MaRDI QID: Q5072590
Yutong Dai, Daniel P. Robinson, Frank E. Curtis
Publication date: 29 April 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2007.14951
Keywords: convex optimization; regularization; nonlinear optimization; logistic regression; linear regression; sparsity; worst-case iteration complexity; subspace acceleration; group regularizer
MSC: Analysis of algorithms and problem complexity (68Q25); Numerical mathematical programming methods (65K05); Convex programming (90C25); Abstract computational complexity for mathematical programming problems (90C60); Nonlinear programming (90C30); Numerical optimization and variational techniques (65K10); Complexity and performance of numerical algorithms (65Y20)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Parallel coordinate descent methods for big data optimization
- Inexact coordinate descent: complexity and preconditioning
- Gradient methods for minimizing composite functions
- Templates for convex cone problems with applications to sparse signal recovery
- A numerical study of limited memory BFGS methods
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- "Active-set complexity" of proximal gradient: how long does it take to find the sparsity pattern?
- Exploiting negative curvature in deterministic and stochastic optimization
- A second-order method for convex ℓ1-regularized optimization with active-set prediction
- Proximal Splitting Methods in Signal Processing
- Accelerated Block-coordinate Relaxation for Regularized Optimization
- Optimization with Sparsity-Inducing Penalties
- Proximal Newton-Type Methods for Minimizing Composite Functions
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Inexact Newton Methods
- Trust Region Methods
- Sparse Reconstruction by Separable Approximation
- First-Order Methods in Optimization
- FaRSA for ℓ1-regularized convex optimization: local convergence and numerical experience
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- De-noising by soft-thresholding
- A Reduced-Space Algorithm for Minimizing ℓ1-Regularized Convex Functions
- Model Selection and Estimation in Regression with Grouped Variables
- A fast unified algorithm for solving group-lasso penalized learning problems