GRID: a variable selection and structure discovery method for high dimensional nonparametric regression
DOI: 10.1214/19-AOS1846 | zbMath: 1454.62125 | MaRDI QID: Q2196249
Maria Lucia Parrella, Francesco Giordano, Soumendra Nath Lahiri
Publication date: 28 August 2020
Published in: The Annals of Statistics
Full work available at URL: https://projecteuclid.org/euclid.aos/1594972841
- Nonparametric regression and quantile regression (62G08)
- Nonparametric hypothesis testing (62G10)
- Asymptotic properties of nonparametric inference (62G20)
- Hypothesis testing in multivariate analysis (62H15)
Related Items (4)
- A nonparametric procedure for linear and nonlinear variable screening
- High-dimensional local polynomial regression with variable selection and dimension reduction
- Linear and nonlinear signal detection and estimation in high-dimensional nonparametric regression under weak sparsity
- High-dimensional local linear regression under sparsity and convex losses
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Sparse inverse covariance estimation with the graphical lasso
- The Adaptive Lasso and Its Oracle Properties
- Marginal empirical likelihood and sure independence feature screening
- A review on empirical likelihood methods for regression
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Component selection and smoothing in multivariate nonparametric regression
- Bias-corrected inference for multivariate nonparametric regression: model selection and oracle property
- High-dimensional additive modeling
- Empirical likelihood and general estimating equations
- Local polynomial regression: Optimal kernels and asymptotic minimax efficiency
- Polynomial splines and their tensor products in extended linear modeling. (With discussions)
- Local polynomial fitting based on empirical likelihood
- Asymptotic properties of backfitting estimators
- Multivariate locally weighted least squares regression
- Optimal global rates of convergence for nonparametric regression
- Selection of variables and dimension reduction in high-dimensional non-parametric regression
- Minimax-optimal nonparametric regression in high dimensions
- Local linear regression smoothers and their minimax efficiencies
- Rodeo: Sparse, greedy nonparametric regression
- High-dimensional graphs and variable selection with the Lasso
- Empirical likelihood confidence intervals for local linear smoothers
- Surface estimation, variable selection, and the nonparametric oracle property
- Linear or Nonlinear? Automatic Structure Discovery for Partially Linear Models
- DASSO: Connections Between the Dantzig Selector and Lasso
- Empirical likelihood ratio confidence intervals for a single functional
- Design-adaptive Nonparametric Regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Sparse Additive Models
- Variable Selection With the Strong Heredity Constraint and Its Oracle Property
- Variable Selection Using Adaptive Nonlinear Interaction Structures in High Dimensions
- Nonparametric Inferences for Additive Models