Doubly penalized estimation in additive regression with high-dimensional data
DOI: 10.1214/18-AOS1757 | zbMath: 1436.62358 | arXiv: 1704.07229 | OpenAlex: W2966000184 | MaRDI QID: Q2328052
Publication date: 9 October 2019
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1704.07229
Keywords: total variation; Sobolev space; high-dimensional data; metric entropy; reproducing kernel Hilbert space; additive model; penalized estimation; ANOVA model; trend filtering; bounded variation space
Mathematics Subject Classification:
- Nonparametric regression and quantile regression (62G08)
- Parametric tolerance and confidence regions (62F25)
- Asymptotic distribution theory in statistics (62E20)
- Linear regression; mixed models (62J05)
- Generalized linear models (logistic models) (62J12)
- Robustness and adaptive procedures (parametric inference) (62F35)
- Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces) (46E22)
- Analysis of variance and covariance (ANOVA) (62J10)
Cites Work
- Fast learning rate of multiple kernel learning: trade-off between sparsity and smoothness
- Minimax optimal rates of estimation in high dimensional additive models
- Sparsity in multiple kernel learning
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Component selection and smoothing in multivariate nonparametric regression
- Variable selection in nonparametric additive models
- High-dimensional additive modeling
- Additive regression and other nonparametric models
- Nonparametric regression under qualitative smoothness assumptions
- Locally adaptive regression splines
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Optimal global rates of convergence for nonparametric regression
- Weak convergence and empirical processes. With applications to statistics
- On the conditions used to prove oracle results for the Lasso
- Additive models with trend filtering
- Doubly penalized estimation in additive regression with high-dimensional data
- Minimax-optimal nonparametric regression in high dimensions
- Simultaneous analysis of Lasso and Dantzig selector
- Sparsity oracle inequalities for the Lasso
- Statistical inference in compound functional models
- Adaptive piecewise polynomial estimation via trend filtering
- $\ell_1$ Trend Filtering
- Sparse Additive Models
- The Partial Linear Model in High Dimensions
- Minimax-optimal rates for sparse additive models over kernel classes via convex programming
- Smoothing spline ANOVA models