Selection of variables and dimension reduction in high-dimensional non-parametric regression
From MaRDI portal
Publication: 1951796
DOI: 10.1214/08-EJS327
zbMath: 1320.62085
arXiv: 0811.1115
MaRDI QID: Q1951796
Guillaume Lecué, Karine Bertin
Publication date: 24 May 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/0811.1115
Related Items (16)
- Statistical inference in sparse high-dimensional additive models
- Improvement on LASSO-type estimator in nonparametric regression
- Variable selection in heteroscedastic single-index quantile regression
- GRID: a variable selection and structure discovery method for high dimensional nonparametric regression
- Bias-corrected inference for multivariate nonparametric regression: model selection and oracle property
- High-dimensional local linear regression under sparsity and convex losses
- Learning sparse gradients for variable selection and dimension reduction
- Statistical inference in compound functional models
- Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model
- Variable selection of high-dimensional non-parametric nonlinear systems by derivative averaging to avoid the curse of dimensionality
- Minimal conditions for consistent variable selection in high dimension
- Sharp variable selection of a sparse submatrix in a high-dimensional noisy matrix
- Variable selection with Hamming loss
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Variable selection consistency of Gaussian process regression
- Adaptive estimation of multivariate piecewise polynomials and bounded variation functions by optimal decision trees
Cites Work
- Minimax theory of image reconstruction
- Fast learning rates for plug-in classifiers
- Lasso-type recovery of sparse representations for high-dimensional data
- Robust reconstruction of functions by the local-approximation method
- Simultaneous analysis of Lasso and Dantzig selector
- Rodeo: Sparse, greedy nonparametric regression
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Learning Theory and Kernel Machines
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data