One-step regularized estimator for high-dimensional regression models
From MaRDI portal
Publication: 6621326
DOI: 10.5705/ss.202022.0065
MaRDI QID: Q6621326
Xing-wei Tong, Donglin Zeng, Yi Wang, Yuanjia Wang
Publication date: 18 October 2024
Published in: Statistica Sinica
Keywords: confidence intervals; semiparametric model; M-estimation; high-dimensional regression; one-step regularized estimators
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Nearly unbiased variable selection under minimax concave penalty
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Exact post-selection inference, with application to the Lasso
- A general theory of hypothesis tests and confidence regions for sparse high dimensional models
- Statistics for high-dimensional data. Methods, theory and applications.
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Variable selection in nonparametric additive models
- Nonparametric maximum likelihood estimation by the method of sieves
- Estimating the dimension of a model
- Convergence rate of sieve estimates
- High-dimensional simultaneous inference with the bootstrap
- Asymptotics for Lasso-type estimators.
- Debiasing the Lasso: optimal sample size for Gaussian designs
- A significance test for the lasso
- High-dimensional generalized linear models and the lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- Asymptotic properties of the residual bootstrap for Lasso estimators
- Bootstrapping Lasso Estimators
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sieve Extremum Estimates for Weakly Dependent Data
- Penalized Composite Quasi-Likelihood for Ultrahigh Dimensional Variable Selection
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Double/debiased machine learning for treatment and structural parameters
- Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model
- Drawing inferences for high‐dimensional linear models: A selection‐assisted partial regression and smoothing approach
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Model Selection and Estimation in Regression with Grouped Variables
- Instrumental Variable Estimation of Nonparametric Models
- Exploration, normalization, and summaries of high density oligonucleotide array probe level data
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- Inference in Additively Separable Models With a High-Dimensional Set of Conditioning Variables