Pivotal estimation via square-root lasso in nonparametric regression
DOI: 10.1214/14-AOS1204 | zbMath: 1321.62030 | arXiv: 1105.1475 | OpenAlex: W2952248799 | MaRDI QID: Q2249850
Lie Wang, Victor Chernozhukov, Alexandre Belloni
Publication date: 3 July 2014
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1105.1475
Keywords: model selection; pivotal; \(\sqrt{n}\)-consistency and asymptotic normality after model selection; \(Z\)-estimation problem; generic semiparametric problem; non-Gaussian heteroscedastic; nonlinear instrumental variable; square-root lasso
MSC: Nonparametric regression and quantile regression (62G08); Estimation in multivariate analysis (62H12); Asymptotic properties of nonparametric inference (62G20); Nonparametric robustness (62G35); Nonparametric estimation (62G05)
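The square-root lasso named in the title and keywords solves \(\min_\beta \|y - X\beta\|_2/\sqrt{n} + (\lambda/n)\|\beta\|_1\); because the residual norm self-normalizes, the penalty level can be chosen without knowing the noise level \(\sigma\), which is what makes the estimator pivotal. A minimal sketch of one common way to compute it, via the scaled-lasso alternation between a noise-level update and a plain lasso step (the coordinate-descent solver, the default penalty level `lam0`, and all constants here are illustrative assumptions, not the authors' implementation or recommended tuning):

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding operator."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, alpha, n_sweeps=200):
    """Coordinate descent for the lasso:
    min_b (1/(2n)) * ||y - X b||^2 + alpha * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_norm = (X ** 2).sum(axis=0) / n
    r = y.copy()                       # residual y - X b (b starts at 0)
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * b[j]        # remove coordinate j from the fit
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, alpha) / col_norm[j]
            r -= X[:, j] * b[j]        # add the updated coordinate back
    return b

def sqrt_lasso(X, y, lam0=None, n_outer=20):
    """Square-root lasso via alternation: estimate the noise level from
    the current residuals, then take a lasso step with penalty
    lam0 * sigma_hat.  lam0 below is an assumed illustrative default,
    not the paper's exact penalty recommendation."""
    n, p = X.shape
    if lam0 is None:
        lam0 = 1.1 * np.sqrt(2.0 * np.log(p) / n)  # assumed penalty level
    b = np.zeros(p)
    for _ in range(n_outer):
        sigma_hat = np.linalg.norm(y - X @ b) / np.sqrt(n)
        b = lasso_cd(X, y, lam0 * sigma_hat)
    sigma_hat = np.linalg.norm(y - X @ b) / np.sqrt(n)
    return b, sigma_hat

# Small demo: sparse signal, noise level unknown to the estimator.
rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[0], beta[1] = 2.0, -3.0
y = X @ beta + rng.standard_normal(n)
b_hat, sigma_hat = sqrt_lasso(X, y)
```

Note that no estimate of \(\sigma\) enters the penalty choice: `lam0` depends only on \(n\) and \(p\), and the noise level is recovered as a by-product (`sigma_hat`), which is the pivotal property the paper extends to the nonparametric setting.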
Related Items
Uses Software
Cites Work
- Large Sample Properties of Generalized Method of Moments Estimators
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Graph Selection with GGMselect
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- Invertibility of random submatrices via tail-decoupling and a matrix Chernoff inequality
- Gaussian approximation of suprema of empirical processes
- Robust inference on average treatment effects with possibly more covariates than observations
- Statistics for high-dimensional data. Methods, theory and applications.
- Sparse recovery under matrix uncertainty
- \(\ell_{1}\)-penalization for mixture regression models
- Oracle inequalities and optimal inference under group sparsity
- Near-ideal model selection by \(\ell _{1}\) minimization
- Sparsity in penalized empirical risk minimization
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Lasso-type recovery of sparse representations for high-dimensional data
- Self-normalized Cramér-type large deviations for independent random variables.
- Weak convergence and empirical processes. With applications to statistics
- On the conditions used to prove oracle results for the Lasso
- Rank penalized estimators for high-dimensional matrices
- Least squares after model selection in high-dimensional sparse models
- Pivotal estimation via square-root lasso in nonparametric regression
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Gaussian approximations and multiplier bootstrap for maxima of sums of high-dimensional random vectors
- Aggregation for Gaussian regression
- Introduction to empirical processes and semiparametric inference
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Sparse Models and Methods for Optimal Instruments With an Application to Eminent Domain
- Nemirovski's Inequalities Revisited
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Scaled sparse linear regression
- Self-Normalized Processes
- CAN ONE ESTIMATE THE UNCONDITIONAL DISTRIBUTION OF POST-MODEL-SELECTION ESTIMATORS?
- Root-N-Consistent Semiparametric Regression
- Efficiency Bounds for Semiparametric Regression
- The Maximum Likelihood and the Nonlinear Three-Stage Least Squares Estimator in the General Nonlinear Simultaneous Equation Model
- Asymptotic Statistics
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Uniform post-selection inference for least absolute deviation regression and other Z-estimation problems
- Introduction to nonparametric estimation