Ridge regression revisited: debiasing, thresholding and bootstrap
From MaRDI portal
Publication: 2148980
DOI: 10.1214/21-AOS2156
MaRDI QID: Q2148980
Dimitris N. Politis, Yunyi Zhang
Publication date: 24 June 2022
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/2009.08071
Cites Work
- Regularized estimation in sparse high-dimensional time series models
- Model-free prediction and regression. A transformation-based approach to inference
- Exact post-selection inference, with application to the Lasso
- Statistical significance in high-dimensional linear models
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- Estimation in high-dimensional linear models with deterministic design matrices
- On the prediction performance of the Lasso
- Statistics for high-dimensional data. Methods, theory and applications.
- Limit of the smallest eigenvalue of a large dimensional sample covariance matrix
- Robust trend inference with series variance estimator and testing-optimal smoothing parameter
- Lasso-type recovery of sparse representations for high-dimensional data
- Bootstrap procedures under some non-i.i.d. models
- Subsampling
- Uniform asymptotic inference and the bootstrap after model selection
- Gaussian approximation for high dimensional time series
- High-dimensional simultaneous inference with the bootstrap
- High-dimensional asymptotics of prediction: ridge regression and classification
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
- Broken adaptive ridge regression and its asymptotic properties
- Jackknife, bootstrap and other resampling methods in regression analysis
- Empirical process of residuals for high-dimensional linear models
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- Debiasing the Lasso: optimal sample size for Gaussian designs
- Robust inference via multiplier bootstrap
- On the asymptotic variance of the debiased Lasso
- Bootstrap and wild bootstrap for high dimensional linear models
- Simultaneous analysis of Lasso and Dantzig selector
- False discovery rate control via debiased Lasso
- High-dimensional generalized linear models and the lasso
- Gaussian approximations and multiplier bootstrap for maxima of sums of high-dimensional random vectors
- High-dimensional graphs and variable selection with the Lasso
- Asymptotic properties of the residual bootstrap for Lasso estimators
- Bootstrapping Lasso Estimators
- Scaled sparse linear regression
- Bootstrap Prediction Intervals for Regression
- Making Wald tests work for cointegrated VAR systems
- Mathematical Statistics
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- A heteroskedasticity and autocorrelation robust F test using an orthonormal series variance estimator
- Deep Knockoffs
- Block bootstrap HAC robust tests: the sophistication of the naive bootstrap
- Distribution-free Prediction Bands for Non-parametric Regression
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models