High-dimensional inference in misspecified linear models
DOI: 10.1214/15-EJS1041
zbMath: 1327.62420
arXiv: 1503.06426
OpenAlex: W1531316514
MaRDI QID: Q491406
Sara van de Geer, Peter Bühlmann
Publication date: 25 August 2015
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1503.06426
Mathematics Subject Classification:
- Parametric tolerance and confidence regions (62F25)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
Related Items
- Doubly robust semiparametric inference using regularized calibrated estimation with high-dimensional data
- High-dimensional inference for personalized treatment decision
- Hierarchical inference for genome-wide association studies: a view on methodology with software
- Debiasing the debiased Lasso with bootstrap
- Inference for High-Dimensional Linear Mixed-Effects Models: A Quasi-Likelihood Approach
- Semi-Supervised Linear Regression
- Models as approximations. I. Consequences illustrated with linear regression
- Double-estimation-friendly inference for high-dimensional misspecified models
- High-dimensional simultaneous inference with the bootstrap
- Goodness-of-Fit Tests for High Dimensional Linear Models
- A High‐dimensional Focused Information Criterion
- Debiasing the Lasso: optimal sample size for Gaussian designs
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- Linear Hypothesis Testing in Dense High-Dimensional Linear Models
Uses Software
Cites Work
- A Heteroskedasticity-Consistent Covariance Matrix Estimator and a Direct Test for Heteroskedasticity
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Confidence intervals for high-dimensional inverse covariance estimation
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Statistics for high-dimensional data. Methods, theory and applications.
- Minimal models for Hilbert modular surfaces of principal congruence subgroups
- High-dimensional variable selection
- Controlling the false discovery rate via knockoffs
- Bootstrapping regression models
- Multivariate adaptive regression splines
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Empirical likelihood-based inference under imputation for missing response data
- A significance test for the lasso
- Rejoinder: ``A significance test for the lasso''
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- An ancillarity paradox which appears in multiple linear regression
- Sparse Models and Methods for Optimal Instruments With an Application to Eminent Domain
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- p-Values for High-Dimensional Regression
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- A Perturbation Method for Inference on Regularized Regression Estimates
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Atomic Decomposition by Basis Pursuit
- False Discovery Rate–Adjusted Multiple Confidence Intervals for Selected Parameters
- Compressed sensing
- Discussion: ``A significance test for the lasso''