Iteratively Reweighted Group Lasso Based on Log-Composite Regularization
DOI: 10.1137/20M1349072 · zbMath: 1476.62157 · OpenAlex: W3194354482 · MaRDI QID: Q5161764
Yifei Lou, Chengyu Ke, Sunyoung Shin, Miju Ahn
Publication date: 1 November 2021
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://doi.org/10.1137/20m1349072
Keywords: variable selection; group sparsity; nonconvex regularization; iteratively reweighted algorithm; directional stationarity
MSC classifications:
- Computational methods for problems pertaining to statistics (62-08)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Numerical mathematical programming methods (65K05)
- Nonconvex programming, global optimization (90C26)
- Genetics and epigenetics (92D10)
- Medical epidemiology (92C60)
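The keywords point to an iteratively reweighted scheme for group-sparse regression. As a generic illustration of that template only (not the paper's log-composite regularizer), the sketch below solves a weighted group lasso by proximal gradient descent and then updates the group weights from the current group norms; the weight rule `w_g = 1/(||beta_g|| + eps)` and all parameter names are assumptions for the example.

```python
import numpy as np

def group_soft_threshold(v, t):
    # Block soft-thresholding: the proximal operator of t * ||v||_2.
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)
    return (1.0 - t / norm) * v

def reweighted_group_lasso(X, y, groups, lam=0.1, eps=1e-3,
                           n_reweight=5, n_inner=200):
    """Generic iteratively reweighted group lasso (illustrative sketch).

    Each outer pass solves a weighted group lasso by proximal gradient
    (ISTA); the reweighting rule w_g = 1/(||beta_g|| + eps) is a common
    choice and stands in for the paper's log-composite scheme.
    """
    n, p = X.shape
    beta = np.zeros(p)
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1/L for the least-squares gradient
    labels = sorted(set(groups))
    idx = {g: np.where(np.asarray(groups) == g)[0] for g in labels}
    w = {g: 1.0 for g in labels}             # first pass = plain group lasso
    for _ in range(n_reweight):
        for _ in range(n_inner):
            grad = X.T @ (X @ beta - y)
            z = beta - step * grad
            for g, ii in idx.items():
                beta[ii] = group_soft_threshold(z[ii], step * lam * w[g])
        # Reweight: groups with small norm get a larger penalty next pass.
        for g, ii in idx.items():
            w[g] = 1.0 / (np.linalg.norm(beta[ii]) + eps)
    return beta
```

On synthetic data with one active and one inactive group, the reweighting pass drives the inactive group exactly to zero while leaving the active coefficients nearly unbiased.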
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Computing sparse representation in a highly coherent dictionary based on difference of \(L_1\) and \(L_2\)
- Surveying and comparing simultaneous sparse approximation (or group-lasso) algorithms
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Global convergence of ADMM in nonconvex nonsmooth optimization
- Fast L1-L2 minimization via a proximal operator
- Minimization of transformed \(L_1\) penalty: theory, difference of convex function algorithm, and robust application in compressed sensing
- Consistency bounds and support recovery of d-stationary solutions of sparse sample average approximations
- Group variable selection via \(\ell_{p,0}\) regularization and application to optimal scoring
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- Ratio and difference of \(l_1\) and \(l_2\) norms and sparse representation with coherent dictionaries
- Finding sparse solutions of systems of polynomial equations via group-sparsity optimization
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- MM Optimization Algorithms
- A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
- Generalized Alternating Projection for Weighted-$\ell_{2,1}$ Minimization with Applications to Model-Based Compressive Sensing
- Computing B-Stationary Points of Nonsmooth DC Programs
- Adaptive estimation with partially overlapping models
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- The Group Lasso for Logistic Regression
- Atomic Decomposition by Basis Pursuit
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Local Strong Homogeneity of a Regularized Estimator
- Uncertainty principles and ideal atomic decomposition
- L1-Regularization Path Algorithm for Generalized Linear Models
- Accelerated Schemes for the $L_1/L_2$ Minimization
- Ensemble estimation and variable selection with semiparametric regression models
- A Scale-Invariant Approach for Sparse Signal Recovery
- Difference-of-Convex Learning: Directional Stationarity, Optimality, and Sparsity
- Structured Sparsity via Alternating Direction Methods
- Model Selection and Estimation in Regression with Grouped Variables
- Stable signal recovery from incomplete and inaccurate measurements
- Compressed sensing
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors
- A fast unified algorithm for solving group-lasso penalized learning problems
- A selective review of group selection in high-dimensional models