Post-model-selection inference in linear regression models: an integrated review
DOI: 10.1214/22-SS135 · zbMath: 1485.62091 · OpenAlex: W4226356811 · MaRDI QID: Q2137823
Abbas Khalili, Masoud Asgharian, Dongliang Zhang
Publication date: 11 May 2022
Published in: Statistics Surveys
Full work available at URL: https://www.projecteuclid.org/journals/statistics-surveys/volume-16/issue-none/Post-model-selection-inference-in-linear-regression-models-An/10.1214/22-SS135.full
Keywords: model selection; high-dimensional linear models; post-selection inference; population- and projection-based regression coefficients
MSC classification: Estimation in multivariate analysis (62H12); Parametric tolerance and confidence regions (62F25); Ridge regression; shrinkage estimators (Lasso) (62J07); Linear regression; mixed models (62J05)
Uses Software
Cites Work
- Greedy function approximation: A gradient boosting machine.
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Nearly unbiased variable selection under minimax concave penalty
- Distribution-Free Predictive Inference For Regression
- Sparse inverse covariance estimation with the graphical lasso
- The Adaptive Lasso and Its Oracle Properties
- On various confidence intervals post-model-selection
- Exact post-selection inference, with application to the Lasso
- A mathematical introduction to compressive sensing
- Valid post-selection inference
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- Statistical significance in high-dimensional linear models
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- A general theory of hypothesis tests and confidence regions for sparse high dimensional models
- High-dimensional variable selection
- Can one estimate the conditional distribution of post-model-selection estimators?
- Nonsingularity and symmetry for linear normal maps
- On the post selection inference constant under restricted isometry properties
- A unified theory of confidence regions and testing for high-dimensional estimating equations
- Uniform asymptotic inference and the bootstrap after model selection
- High-dimensional simultaneous inference with the bootstrap
- Exact post-selection inference for the generalized Lasso path
- Selective inference with a randomized response
- High-dimensional inference: confidence intervals, p-values and R software hdi
- Least angle regression. (With discussion)
- Weak convergence and empirical processes. With applications to statistics
- In defense of the indefensible: a very naïve approach to high-dimensional inference
- Uniformly valid confidence intervals post-model-selection
- Valid post-selection inference in model-free linear regression
- A significance test for the lasso
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- Power of the spacing test for least-angle regression
- Central limit theorems and bootstrap in high dimensions
- Valid confidence intervals for post-model-selection predictors
- Boosting for high-dimensional linear models
- Fisher in 1921
- On some non-linear elliptic differential functional equations
- Upper bounds on the minimum coverage probability of confidence intervals in regression after model selection
- Sparse Models and Methods for Optimal Instruments With an Application to Eminent Domain
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- p-Values for High-Dimensional Regression
- Bootstrapping Lasso Estimators
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- A Perturbation Method for Inference on Regularized Regression Estimates
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- Can one estimate the unconditional distribution of post-model-selection estimators?
- Normal Maps Induced by Linear Transformations
- The Little Bootstrap and Other Methods for Dimensionality Selection in Regression: X-Fixed Prediction Error
- Inference after variable selection in linear regression models
- Science and Statistics
- The distribution of estimators after model selection: large and small sample results
- The finite-sample distribution of post-model-selection estimators and uniform versus nonuniform approximations
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Local Strong Homogeneity of a Regularized Estimator
- Selective inference with unknown variance via the square-root lasso
- Inference on Treatment Effects after Selection among High-Dimensional Controls
- High-Dimensional Statistics
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Stability Selection
- Post-selection inference for ℓ1-penalized likelihood models
- Hypothesis testing in finite mixture of regressions: Sparsity and model selection uncertainty
- On the Length of Post-Model-Selection Confidence Intervals Conditional on Polyhedral Constraints
- Variable Selection with Error Control: Another Look at Stability Selection
- Double/debiased machine learning for treatment and structural parameters
- Statistical Foundations of Data Science
- Confidence Intervals for Sparse Penalized Regression With Random Designs
- Uniform post-selection inference for least absolute deviation regression and other Z-estimation problems
- Regularization and Variable Selection Via the Elastic Net
- Confidence Intervals and Regions for the Lasso by Using Stochastic Variational Inequality Techniques in Optimization
- On the coverage probability of confidence intervals in regression after variable selection
- Model selection and inference: facts and fiction
- Asymptotics of Selective Inference
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- On the Large-Sample Minimal Coverage Probability of Confidence Intervals After Model Selection
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Post‐selection inference for changepoint detection algorithms with application to copy number variation data