The all-or-nothing phenomenon in sparse linear regression
MaRDI QID: Q2078961
DOI: 10.4171/MSL/22 · zbMath: 1493.62430 · arXiv: 1903.05046 · OpenAlex: W4205482143
Jiaming Xu, Ilias Zadik, Galen Reeves
Publication date: 4 March 2022
Published in: Mathematical Statistics and Learning
Full work available at URL: https://arxiv.org/abs/1903.05046
Linear regression; mixed models (62J05) · Large deviations (60F10) · Measures of information, entropy (94A17) · Information theory (general) (94A15)
Related Items (5)
- Sparse high-dimensional linear regression. Estimating squared error and a phase transition
- The backbone method for ultra-high dimensional sparse machine learning
- Variable selection, monotone likelihood ratio and group sparsity
- Testing correlation of unlabeled random graphs
- The all-or-nothing phenomenon in sparse linear regression
Cites Work
- Reconstruction and estimation in the planted partition model
- Variable selection with Hamming loss
- Adaptive estimation of a quadratic functional by model selection.
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- Detection boundary in sparse regression
- Optimal sparsity testing in linear regression model
- The all-or-nothing phenomenon in sparse linear regression
- Statistical limits of spiked tensor models
- Atomic Decomposition by Basis Pursuit
- Limits on Support Recovery With Probabilistic Models: An Information-Theoretic Framework
- Approximate Sparsity Pattern Recovery: Information-Theoretic Lower Bounds
- Decoding by Linear Programming
- Randomly Spread CDMA: Asymptotics Via Statistical Physics
- Maxwell Construction: The Hidden Bridge Between Iterative and Maximum a Posteriori Decoding
- A statistical-mechanics approach to large-system analysis of CDMA multiuser detectors
- Information-Theoretic Bounds and Phase Transitions in Clustering, Sparse PCA, and Submatrix Localization
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Necessary and Sufficient Conditions for Sparsity Pattern Recovery
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Shannon-Theoretic Limits on Noisy Compressive Sampling
- Optimal Variable Selection and Adaptive Noisy Compressed Sensing
- Mutual Information and Optimality of Approximate Message-Passing in Random Linear Estimation
- Least Squares Superposition Codes of Moderate Dictionary Size Are Reliable at Rates up to Capacity
- The Sampling Rate-Distortion Tradeoff for Sparsity Pattern Recovery in Compressed Sensing
- Limits on Support Recovery of Sparse Signals via Multiple-Access Communication Techniques
- Nearly Sharp Sufficient Conditions on Exact Sparsity Pattern Recovery
- Capacity-Achieving Sparse Superposition Codes via Approximate Message Passing Decoding
- Information Theoretic Bounds for Compressed Sensing
- Information-Theoretic Limits on Sparse Signal Recovery: Dense versus Sparse Measurement Matrices
- Fast Sparse Superposition Codes Have Near Exponential Error Probability for $R < \mathcal{C}$
- Reed–Muller Codes Achieve Capacity on Erasure Channels
- Approximate Message-Passing Decoder and Capacity Achieving Sparse Superposition Codes
- The Distribution of a Quadratic Form of Normal Random Variables
- Compressed sensing