Hypercube estimators: penalized least squares, submodel selection, and numerical stability
DOI: 10.1016/j.csda.2013.05.020
zbMath: 1471.62027
OpenAlex: W2012132481
MaRDI QID: Q1621345
Publication date: 8 November 2018
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2013.05.020
Mathematics Subject Classification (MSC):
- Computational methods for problems pertaining to statistics (62-08)
- Nonparametric regression and quantile regression (62G08)
- Estimation in multivariate analysis (62H12)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
Cites Work
- Linear smoothers and additive models
- Adaptation over parametric families of symmetric linear estimators
- A note on adaptive group Lasso
- Choosing a kernel regression estimator. With comments and a rejoinder by the authors
- Flexible smoothing with \(B\)-splines and penalties. With comments and a rejoinder by the authors
- Modulation of estimators and confidence sets.
- Smoothing spline ANOVA for exponential families, with application to the Wisconsin epidemiological study of diabetic retinopathy. (The 1994 Neyman Memorial Lecture)
- Semiparametric Regression
- Penalized regression with model-based penalties
- Some Comments on \(C_p\)