Rodeo: Sparse, greedy nonparametric regression
From MaRDI portal
Publication: 2477052
DOI: 10.1214/009053607000000811 · zbMath: 1132.62026 · arXiv: 0803.1709 · OpenAlex: W3099516248 · MaRDI QID: Q2477052
Larry Alan Wasserman, John D. Lafferty
Publication date: 12 March 2008
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/0803.1709
nonparametric regression; variable selection; sparsity; minimax rates of convergence; local linear smoothing; bandwidth estimation
Nonparametric regression and quantile regression (62G08); Density estimation (62G07); Nonparametric hypothesis testing (62G10); Asymptotic properties of nonparametric inference (62G20); Nonparametric estimation (62G05)
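The paper indexed above introduces the rodeo (regularization of the derivative expectation operator): bandwidths are shrunk greedily in the coordinates where the local fit is still sensitive to the bandwidth, so relevant variables end with small bandwidths and irrelevant ones stay smoothed out. As a rough, hypothetical illustration of that idea — not the paper's algorithm, which uses local linear smoothing and a data-driven threshold — here is a local-constant sketch with a fixed, hand-picked threshold `lam`:

```python
import numpy as np

def local_avg(X, y, x0, h):
    """Local constant (Nadaraya-Watson) fit at x0 with a product
    Gaussian kernel and per-coordinate bandwidths h."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2).prod(axis=1)
    return float(np.sum(w * y) / (np.sum(w) + 1e-12))

def rodeo_sketch(X, y, x0, h0=1.0, beta=0.8, lam=0.05, h_min=1e-3, eps=1e-4):
    """Greedy bandwidth shrinking: while the fit is still sensitive to
    bandwidth h_j (|dm/dh_j| > lam), shrink h_j by the factor beta;
    otherwise freeze it.  Relevant coordinates end with small bandwidths,
    irrelevant ones keep h near h0."""
    d = X.shape[1]
    h = np.full(d, h0)
    active = set(range(d))
    while active:
        frozen = []
        for j in active:
            # crude finite-difference derivative of the fit w.r.t. h_j
            h_up = h.copy()
            h_up[j] += eps
            Zj = (local_avg(X, y, x0, h_up) - local_avg(X, y, x0, h)) / eps
            if abs(Zj) > lam and h[j] * beta > h_min:
                h[j] *= beta           # still sensitive: keep shrinking
            else:
                frozen.append(j)       # insensitive (or floor hit): freeze
        active -= set(frozen)
    return h, local_avg(X, y, x0, h)
```

For a flat response the derivative is essentially zero in every coordinate, so no bandwidth shrinks; the paper's version replaces `lam` with a variance-based threshold of order sqrt(log n) and proves minimax rates under sparsity.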
Related Items
- Fast Bayesian model assessment for nonparametric additive regression
- A nonparametric procedure for linear and nonlinear variable screening
- On the RODEO Method for Variable Selection
- Variable selection in heteroscedastic single-index quantile regression
- Bandwidth matrix selectors for kernel regression
- Kernel estimation of regression function gradient
- An Outer-Product-of-Gradient Approach to Dimension Reduction and its Application to Classification in High Dimensional Space
- High-dimensional local polynomial regression with variable selection and dimension reduction
- Variable Selection Via Thompson Sampling
- Linear and nonlinear signal detection and estimation in high-dimensional nonparametric regression under weak sparsity
- GRID: a variable selection and structure discovery method for high dimensional nonparametric regression
- Bias-corrected inference for multivariate nonparametric regression: model selection and oracle property
- High-dimensional local linear regression under sparsity and convex losses
- Learning sparse gradients for variable selection and dimension reduction
- Selection of variables and dimension reduction in high-dimensional non-parametric regression
- Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model
- Variable selection of high-dimensional non-parametric nonlinear systems by derivative averaging to avoid the curse of dimensionality
- Additive prediction and boosting for functional data
- Minimal conditions for consistent variable selection in high dimension
- Challenging the curse of dimensionality in multivariate local linear regression
- Input Variable Selection in Neural Network Models
- On robust cross-validation for nonparametric smoothing
- Locally modelled regression and functional data
- Sharp variable selection of a sparse submatrix in a high-dimensional noisy matrix
- Local Polynomials for Variable Selection
- Lazy lasso for local regression
- Sparse nonparametric graphical models
- A spectral series approach to high-dimensional nonparametric regression
- Variable selection with Hamming loss
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Variable selection consistency of Gaussian process regression
- Sparse nonparametric model for regression with functional covariate
- Minimax-optimal nonparametric regression in high dimensions
- Moving Least Squares Regression for High-Dimensional Stochastic Simulation Metamodeling
Uses Software
Cites Work
- Bandwidth choice for nonparametric regression
- Multivariate adaptive regression splines
- Optimal spatial adaptation to inhomogeneous smoothness: An approach based on kernel estimates with variable bandwidth selectors
- Polynomial splines and their tensor products in extended linear modeling. (With discussions)
- Multivariate locally weighted least squares regression
- A distribution-free theory of nonparametric regression
- Asymptotics for Lasso-type estimators.
- Structure adaptive approach for dimension reduction.
- Nonconcave penalized likelihood with a diverging number of parameters.
- Least angle regression. (With discussion)
- Greed is Good: Algorithmic Results for Sparse Approximation
- Just relax: convex programming methods for identifying sparse signals in noise
- Design-adaptive Nonparametric Regression
- 10.1162/15324430152748236
- Empirical-Bias Bandwidths for Local Polynomial Nonparametric Regression and Density Estimation
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Model-Free Variable Selection
- For most large underdetermined systems of linear equations the minimal 𝓁1‐norm solution is also the sparsest solution
- Variable Selection and Model Building via Likelihood Basis Pursuit
- Component Identification and Estimation in Nonlinear High-Dimensional Regression Models by Structural Adaptation
- The elements of statistical learning. Data mining, inference, and prediction
- Nonlinear estimation in anisotropic multi-index denoising