Variable selection via adaptive false negative control in linear regression
Publication: 2283578
DOI: 10.1214/19-EJS1649
zbMath: 1434.62156
arXiv: 1804.07416
OpenAlex: W2995098115
MaRDI QID: Q2283578
Publication date: 3 January 2020
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1804.07416
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Estimating the proportion of false null hypotheses among a large number of independently tested hypotheses
- Estimating sparse precision matrix: optimal rates of convergence and adaptive estimation
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- UPS delivers optimal phase diagram in high-dimensional variable selection
- Pre-surgical fMRI data analysis using a spatially adaptive conditionally autoregressive model
- Statistics for high-dimensional data. Methods, theory and applications
- Global testing under sparse alternatives: ANOVA, multiple comparisons and the higher criticism
- Controlling the false discovery rate via knockoffs
- SLOPE-adaptive variable selection via convex optimization
- Detection of sparse mixtures: higher criticism and scan statistic
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Higher criticism for detecting sparse heterogeneous mixtures
- A strong law of large numbers related to multiple testing normal means
- Debiasing the Lasso: optimal sample size for Gaussian designs
- Confidence intervals for high-dimensional linear regression: minimax rates and adaptivity
- Predictor ranking and false discovery proportion control in high-dimensional regression
- Properties of higher criticism under strong dependence
- High-dimensional graphs and variable selection with the Lasso
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Optimal Detection of Heterogeneous and Heteroscedastic Mixtures
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Estimating False Discovery Proportion Under Arbitrary Covariance Dependence
- Notes on Generating Functions of Polynomials: (2) Hermite Polynomials
- Panning for Gold: ‘Model-X’ Knockoffs for High Dimensional Controlled Variable Selection
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Sequential Selection Procedures and False Discovery Rate Control
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models