Group sparse structural smoothing recovery: model, statistical properties and algorithm
From MaRDI portal
Publication:6570353
DOI: 10.1007/s11222-024-10438-0
zbMATH Open: 1541.62021
MaRDI QID: Q6570353
Publication date: 10 July 2024
Published in: Statistics and Computing
Keywords: nonconvex optimization; recovery bound; alternating direction method of multipliers (ADMM); group sparse regularization
MSC classifications: Computational methods for problems pertaining to statistics (62-08); Nonparametric regression and quantile regression (62G08); Nonconvex programming, global optimization (90C26)
Cites Work
- Nonlinear total variation based noise removal algorithms
- Recovery of sparsest signals via \(\ell^q \)-minimization
- Oracle inequalities and optimal inference under group sparsity
- The \(\ell_{2,q}\) regularized group sparse optimization: lower bound theory, recovery bound and algorithms
- Properties and refinements of the fused Lasso
- Global convergence of ADMM in nonconvex nonsmooth optimization
- An inertial proximal partially symmetric ADMM-based algorithm for linearly constrained multi-block nonconvex optimization problems with applications
- Pathwise coordinate optimization
- Sparse estimation via lower-order penalty optimization methods in high-dimensional linear regression
- Lower Bound Theory of Nonzero Entries in Solutions of $\ell_2$-$\ell_p$ Minimization
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Additive Models
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Spatial smoothing and hot spot detection for CGH data using the fused lasso
- \(L_{1/2}\) Regularization: Convergence of Iterative Half Thresholding Algorithm
- Sparsity and Smoothness Via the Fused Lasso
- Convergence rate bounds for a proximal ADMM with over-relaxation stepsize parameter for solving nonconvex linearly constrained problems
- Model Selection via Bayesian Information Criterion for Quantile Regression Models
- Iterative Alpha Expansion for Estimating Gradient-Sparse Signals from Linear Measurements
- Group sparse optimization via $\ell_{p,q}$ regularization
- Model Selection and Estimation in Regression with Grouped Variables
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- A fast unified algorithm for solving group-lasso penalize learning problems
- Solving constrained nonsmooth group sparse optimization via group Capped-\(\ell_1\) relaxation and group smoothing proximal gradient algorithm
- Accelerated gradient methods for sparse statistical learning with nonconvex penalties
- Multi-block alternating direction method of multipliers for ultrahigh dimensional quantile fused regression