Sparse high-dimensional linear regression. Estimating squared error and a phase transition
DOI: 10.1214/21-AOS2130 · zbMath: 1486.62200 · OpenAlex: W4226485610 · MaRDI QID: Q2131259
Publication date: 25 April 2022
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/21-aos2130
Keywords: sparsity; high-dimensional linear regression; second moment method; overlap gap property; all-or-nothing phenomenon
MSC classifications: Linear regression; mixed models (62J05) · Signal theory (characterization, reconstruction, filtering, etc.) (94A12) · Statistical aspects of information-theoretic topics (62B10) · Probability in computer science (algorithm analysis, random structures, phase transitions, etc.) (68Q87)
Related Items (4)
Cites Work
- A mathematical introduction to compressive sensing
- Global testing under sparse alternatives: ANOVA, multiple comparisons and the higher criticism
- Iterative hard thresholding for compressed sensing
- A simple proof of the restricted isometry property for random matrices
- Dense subgraphs in random graphs
- On the conditions used to prove oracle results for the Lasso
- Finding a large submatrix of a Gaussian random matrix
- Local algorithms for independent sets are half-optimal
- The all-or-nothing phenomenon in sparse linear regression
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- Model selection via multifold cross validation
- Counting the faces of randomly-projected hypercubes and orthants, with applications
- Simultaneous analysis of Lasso and Dantzig selector
- Accuracy assessment for high-dimensional linear regression
- Support union recovery in high-dimensional multivariate regression
- Confidence sets in sparse regression
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Performance of Sequential Local Algorithms for the Random NAE-$K$-SAT Problem
- Limits on Support Recovery With Probabilistic Models: An Information-Theoretic Framework
- Approximate Sparsity Pattern Recovery: Information-Theoretic Lower Bounds
- Reconstruction and Clustering in Random Constraint Satisfaction Problems
- On independent sets in random graphs
- Decoding by Linear Programming
- An overview of recent developments in genomics and associated statistical methods
- The Variable Selection Problem
- Lattice decoding for joint detection in direct-sequence CDMA systems
- Information-Theoretic Bounds and Phase Transitions in Clustering, Sparse PCA, and Submatrix Localization
- Two-Sample Covariance Matrix Testing and Support Recovery in High-Dimensional and Sparse Settings
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Inference in High-Dimensional Linear Regression via Lattice Basis Reduction and Integer Relation Detection
- Mutual information for low-rank even-order symmetric tensor estimation
- The Error Probability of Sparse Superposition Codes With Approximate Message Passing Decoding
- Sparse Regression Codes
- Least Squares Superposition Codes of Moderate Dictionary Size Are Reliable at Rates up to Capacity
- The Sampling Rate-Distortion Tradeoff for Sparsity Pattern Recovery in Compressed Sensing
- Nearly Sharp Sufficient Conditions on Exact Sparsity Pattern Recovery
- Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise
- Capacity-Achieving Sparse Superposition Codes via Approximate Message Passing Decoding
- Information-Theoretic Limits on Sparse Signal Recovery: Dense versus Sparse Measurement Matrices
- Fast Sparse Superposition Codes Have Near Exponential Error Probability for $R < \mathcal{C}$
- Information-Theoretically Optimal Compressed Sensing via Spatial Coupling and Approximate Message Passing
- EigenPrism: Inference for High Dimensional Signal-to-Noise Ratios
- High-Dimensional Sparse Factor Modeling: Applications in Gene Expression Genomics
- Stable signal recovery from incomplete and inaccurate measurements
- Elements of Information Theory
- A Bound on Tail Probabilities for Quadratic Forms in Independent Random Variables
- On the solution‐space geometry of random constraint satisfaction problems
- Compressed sensing
- Limits of local algorithms over sparse random graphs