Smoothing composite proximal gradient algorithm for sparse group Lasso problems with nonsmooth loss functions
Publication: 6584749
DOI: 10.1007/s12190-024-02034-2
zbMATH Open: 1542.90151
MaRDI QID: Q6584749
Xian Zhang, Huiling Shen, Ding-Tao Peng
Publication date: 8 August 2024
Published in: Journal of Applied Mathematics and Computing
Keywords: sublinear convergence rate; anti-outlier; nonsmooth loss function; smoothing composite proximal gradient algorithm; sparse group Lasso problem
MSC classification: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
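The title and keywords describe a smoothing composite proximal gradient method for sparse group Lasso problems with a nonsmooth (anti-outlier) loss. As a rough, generic illustration only, and not the authors' exact algorithm, the sketch below shows one common way such a scheme can be assembled: an l1 loss smoothed by a Huber-type approximation, combined with the proximal map of the sparse group Lasso penalty (elementwise soft-thresholding followed by groupwise shrinkage). All names, parameters (lam1, lam2, mu), the fixed smoothing parameter, and the constant step size are assumptions made for illustration.

```python
# Minimal sketch of a smoothing proximal gradient step for sparse group Lasso
# with an l1 (anti-outlier) loss. Illustrative assumptions, not the paper's method.
import numpy as np

def grad_smoothed_l1_loss(A, b, x, mu):
    """Gradient of the Huber-smoothed l1 loss f_mu(Ax - b)."""
    r = A @ x - b
    g = np.where(np.abs(r) <= mu, r / mu, np.sign(r))
    return A.T @ g

def prox_sparse_group_lasso(z, t, lam1, lam2, groups):
    """Prox of t*(lam1*||x||_1 + lam2*sum_g ||x_g||_2):
    elementwise soft-thresholding, then groupwise shrinkage."""
    x = np.sign(z) * np.maximum(np.abs(z) - t * lam1, 0.0)
    for g in groups:                      # groups: list of index arrays
        ng = np.linalg.norm(x[g])
        if ng > 0.0:
            x[g] *= max(0.0, 1.0 - t * lam2 / ng)
    return x

def smoothing_proximal_gradient(A, b, groups, lam1=0.1, lam2=0.1,
                                mu=1e-2, max_iter=500):
    """Plain proximal gradient on the smoothed problem with fixed mu."""
    x = np.zeros(A.shape[1])
    # Lipschitz constant of the smoothed-loss gradient: ||A||_2^2 / mu
    L = np.linalg.norm(A, 2) ** 2 / mu
    t = 1.0 / L
    for _ in range(max_iter):
        x = prox_sparse_group_lasso(
            x - t * grad_smoothed_l1_loss(A, b, x, mu),
            t, lam1, lam2, groups)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 10))
    x_true = np.zeros(10)
    x_true[:3] = [1.0, -2.0, 0.5]         # only the first group is active
    b = A @ x_true + 0.01 * rng.standard_normal(40)
    groups = [np.arange(0, 5), np.arange(5, 10)]
    print(smoothing_proximal_gradient(A, b, groups))
```

In practice such smoothing schemes typically decrease mu along the iterations rather than fixing it, which is what yields the sublinear convergence rates mentioned in the keywords; the fixed-mu loop above is kept only to keep the sketch short.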
Cites Work
- Sparse principal component analysis and iterative thresholding
- The \(\ell_{2,q}\) regularized group sparse optimization: lower bound theory, recovery bound and algorithms
- Group variable selection via \(\ell_{p,0}\) regularization and application to optimal scoring
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- Asymptotic theory of the adaptive sparse group Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- Efficient block-coordinate descent algorithms for the group Lasso
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- A smoothing proximal gradient algorithm with extrapolation for the relaxation of \({\ell_0}\) regularization problem
- Multivariate sparse group lasso for the multivariate multiple linear regression with an arbitrary group structure
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Probing the Pareto Frontier for Basis Pursuit Solutions
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Tensor SVD: Statistical and Computational Limits
- On Efficiently Solving the Subproblems of a Level-Set Method for Fused Lasso Problems
- GESPAR: Efficient Phase Retrieval of Sparse Signals
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- Group Sparse Recovery via the $\ell ^0(\ell ^2)$ Penalty: Theory and Algorithm
- Sparse Phase Retrieval: Uniqueness Guarantees and Recovery Algorithms
- Sparse Group Lasso: Optimal Sample Complexity, Convergence Rate, and Statistical Inference
- Difference-of-Convex Algorithms for a Class of Sparse Group $\ell_0$ Regularized Optimization Problems
- Optimal Sparse Singular Value Decomposition for High-Dimensional High-Order Data
- Computation of second-order directional stationary points for group sparse optimization
- Alternating Structure-Adapted Proximal Gradient Descent for Nonconvex Nonsmooth Block-Regularized Problems
- On Consistency and Sparsity for Principal Components Analysis in High Dimensions
- Machine learning methods in the computational biology of cancer
- Model Selection and Estimation in Regression with Grouped Variables
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- Group Sparse Optimization for Images Recovery Using Capped Folded Concave Functions
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors
- Solving constrained nonsmooth group sparse optimization via group Capped-\(\ell_1\) relaxation and group smoothing proximal gradient algorithm
Related Items (1)