Characterizing the SLOPE trade-off: a variational perspective and the Donoho-Tanner limit
From MaRDI portal
DOI: 10.1214/22-aos2194
arXiv: 2105.13302
MaRDI QID: Q6046301
Zhiqi Bu, Jason M. Klusowski, Weijie J. Su, Cynthia Rush
Publication date: 10 May 2023
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/2105.13302
Keywords: SLOPE; phase transition; false discovery rate; approximate message passing; true positive rate; sorted \(\ell_1\) regularization
MSC classifications:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Asymptotic distribution theory in statistics (62E20)
- Linear regression; mixed models (62J05)
- Parametric hypothesis testing (62F03)
Cites Work
- A numerically stable dual method for solving strictly convex quadratic programs
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- qpOASES: a parametric active-set algorithm for quadratic programming
- SLOPE-adaptive variable selection via convex optimization
- False discoveries occur early on the Lasso path
- Slope meets Lasso: improved oracle bounds and optimality
- Overcoming the limitations of phase transition by higher order analysis of regularization techniques
- On the asymptotic properties of SLOPE
- Which bridge estimator is the best for variable selection?
- The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square
- Adapting to unknown sparsity by controlling the false discovery rate
- High-dimensional centrally symmetric polytopes with neighborliness proportional to dimension
- Counting faces of randomly projected polytopes when the projection radically lowers dimension
- Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing
- Perturbation analysis of optimization problems in Banach spaces
- Optimization Problems with Perturbations: A Guided Tour
- When is the first spurious variable selected by sequential regression procedures?
- Does SLOPE outperform bridge regression?
- The Price of Competition: Effect Size Heterogeneity Matters in High Dimensions
- Algorithmic Analysis and Statistical Estimation of SLOPE via Approximate Message Passing
- Group SLOPE – Adaptive Selection of Groups of Predictors
- The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
- Table of the zeros and weight factors of the first fifteen Laguerre polynomials
- Consistent parameter estimation for Lasso and approximate message passing