Square root LASSO: well-posedness, Lipschitz stability, and the tuning trade-off
DOI: 10.1137/23m1561968
zbMATH Open: 1544.49022
MaRDI QID: Q6580000
Aaron Berk, Simone Brugiapaglia, Tim Hoheisel
Publication date: 29 July 2024
Published in: SIAM Journal on Optimization
Keywords: sensitivity analysis; implicit function theorem; convex analysis; variational analysis; Lipschitz stability; sparse recovery; square root LASSO
Classification (MSC): Ridge regression; shrinkage estimators (Lasso) (62J07); Convex programming (90C25); Sensitivity, stability, well-posedness (49K40); Signal theory (characterization, reconstruction, filtering, etc.) (94A12); Sampling theory in information and communication theory (94A20); Inverse problems in optimal control (49N45)
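For context, the square root LASSO named in the title is the convex program min_x ||Ax - b||_2 + lambda * ||x||_1, whose data-fit term is not squared (unlike the standard LASSO), which makes the tuning parameter insensitive to the noise level. The following is a minimal illustrative sketch, not the paper's method: a plain proximal-gradient loop in NumPy, assuming the residual stays away from zero so the data-fit term is differentiable along the iterates; the helper name `sqrt_lasso` and the fixed step size are assumptions for the sketch.

```python
import numpy as np

def sqrt_lasso(A, b, lam, step=None, n_iter=5000):
    """Sketch of min_x ||Ax - b||_2 + lam * ||x||_1 via proximal gradient.

    Illustrative only: assumes the residual A x - b stays nonzero so the
    square-root data-fit term is smooth at every iterate, and uses a
    conservative fixed step size rather than a line search.
    """
    m, n = A.shape
    if step is None:
        # conservative fixed step; the local Lipschitz constant of the
        # gradient of ||Ax - b||_2 scales like ||A||^2 / ||residual||
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(n_iter):
        r = A @ x - b
        nr = np.linalg.norm(r)
        if nr < 1e-12:                  # exact fit reached; stop
            break
        g = A.T @ r / nr                # gradient of ||Ax - b||_2 (r != 0)
        z = x - step * g
        # soft-thresholding = prox of step * lam * ||.||_1
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x
```

A disciplined-convex solver such as CVXPY (cited below) would state the same program directly as `cp.Minimize(cp.norm(A @ x - b, 2) + lam * cp.norm(x, 1))`; the sketch above only illustrates the objective's structure.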
Cites Work
- Estimation and testing under sparsity. École d'Été de Probabilités de Saint-Flour XLV -- 2015
- On the solution uniqueness characterization in the L1 norm and polyhedral gauge recovery
- The Lasso problem and uniqueness
- The sparsity of LASSO-type minimizers
- Pivotal estimation via square-root lasso in nonparametric regression
- Necessary and sufficient conditions of solution uniqueness in 1-norm minimization
- Stable recovery of analysis based approaches
- Simultaneous analysis of Lasso and Dantzig selector
- The degrees of freedom of partly smooth regularizers
- Correcting for unknown errors in sparse high-dimensional function approximation
- Do log factors matter? On optimal wavelet approximation and the foundations of compressed sensing
- Uniqueness in nuclear norm minimization: flatness of the nuclear norm sphere and simultaneous polarization
- Low Complexity Regularization of Linear Inverse Problems
- CVXPY: A Python-Embedded Modeling Language for Convex Optimization
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Scaled sparse linear regression
- Variational Analysis
- Selective inference with unknown variance via the square-root lasso
- Variational Analysis and Applications
- New Method of Sparse Parameter Estimation in Separable Models and Its Use for Spectral Analysis of Irregularly Sampled Data
- Sharp Oracle Inequalities for Square Root Regularization
- Compressive Imaging: Structure, Sampling, Learning
- The Gap between Theory and Practice in Function Approximation with Deep Neural Networks
- Robust instance-optimal recovery of sparse signals at unknown noise levels
- Sparse Polynomial Approximation of High-Dimensional Functions
- On the Best Choice of Lasso Program Given Data Parameters
- Sensitivity of ℓ1 minimization to parameter choice
- The Group Square-Root Lasso: Theoretical Properties and Fast Algorithms
- Implicit Functions and Solution Mappings
- Convex Analysis
- Compressed sensing
- From Perspective Maps to Epigraphical Projections
- Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems
- LASSO Reloaded: A Variational Analysis Perspective with Applications to Compressed Sensing