Comment on “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression”
From MaRDI portal
Publication: 5146023
DOI: 10.1080/01621459.2020.1837138 · zbMath: 1452.62519 · OpenAlex: W3117440047 · MaRDI QID: Q5146023
Kaizheng Wang, Cong Ma, Jianqing Fan
Publication date: 22 January 2021
Published in: Journal of the American Statistical Association
Full work available at URL: https://doi.org/10.1080/01621459.2020.1837138
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
- Robustness and adaptive procedures (parametric inference) (62F35)
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Factor-Adjusted Regularized Model Selection
- Gaussian approximation of suprema of empirical processes
- Sub-Gaussian mean estimators
- Statistical inference based on ranks
- Are discoveries spurious? Distributions of maximum spurious correlations and their applications
- Challenging the empirical mean and empirical variance: a deviation study
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Adaptive robust variable selection
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Scaled sparse linear regression
- Adaptive Huber Regression
- Asymptotic Theory of Least Absolute Error Regression
- A New Principle for Tuning-Free Huber Regression
- Estimation of High Dimensional Mean Regression in the Absence of Symmetry and Light Tail Assumptions
- Estimates of Location Based on Rank Tests
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models