Wavelet-based gradient boosting
Publication: 2631350
DOI: 10.1007/s11222-014-9474-0
zbMath: 1342.62104
OpenAlex: W2033529337
MaRDI QID: Q2631350
E. Dubossarsky, John T. Ormerod, Matthew P. Wand, Jerome H. Friedman
Publication date: 29 July 2016
Published in: Statistics and Computing
Full work available at URL: https://doi.org/10.1007/s11222-014-9474-0
Cites Work
- Greedy function approximation: A gradient boosting machine.
- Boosting algorithms: regularization, prediction and model fitting
- Comment: Boosting algorithms: regularization, prediction and model fitting
- Asymptotics and optimal bandwidth selection for highest density region estimation
- Knot selection by boosting techniques
- Least angle regression. (With discussion)
- Penalized wavelets: embedding wavelets into semiparametric regression
- On the ``degrees of freedom'' of the lasso
- Boosting for high-dimensional linear models
- Smoothing Parameter Selection in Nonparametric Regression Using an Improved Akaike Information Criterion
- Ideal spatial adaptation by wavelet shrinkage
- Model Selection and the Principle of Minimum Description Length
- De-noising by soft-thresholding
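The cited works above include Friedman's gradient boosting machine and Donoho-style soft-thresholding. As a rough illustration of how those two ingredients can be combined under the title "wavelet-based gradient boosting", the following is a minimal sketch, assuming squared-error boosting on a 1-D regression problem with a Haar-type wavelet dictionary as the base learner and soft-thresholding of its coefficients; the dictionary construction, the threshold `lam`, and the step size `nu` are illustrative assumptions, not the algorithm of the paper.

```python
# Illustrative sketch only: L2 gradient boosting where each step fits the current
# residuals with a small Haar-type wavelet dictionary and soft-thresholds the
# coefficients. Function names and tuning constants are assumptions, not taken
# from the publication this record describes.
import numpy as np


def haar_dictionary(x, max_level=4):
    """Evaluate a crude Haar-type dictionary (constant plus mother wavelets) at x in [0, 1)."""
    cols = [np.ones_like(x)]
    for j in range(max_level):
        for k in range(2 ** j):
            lo, mid, hi = k / 2 ** j, (k + 0.5) / 2 ** j, (k + 1) / 2 ** j
            psi = np.where((x >= lo) & (x < mid), 1.0,
                           np.where((x >= mid) & (x < hi), -1.0, 0.0))
            cols.append(2 ** (j / 2) * psi)
    return np.column_stack(cols)


def soft_threshold(b, lam):
    """Donoho-style soft-thresholding of a coefficient vector."""
    return np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)


def wavelet_boost(x, y, n_steps=50, nu=0.1, lam=0.05):
    """Gradient boosting with a soft-thresholded wavelet fit as the base learner."""
    Phi = haar_dictionary(x)
    fit = np.zeros_like(y)
    for _ in range(n_steps):
        resid = y - fit                      # negative gradient of squared-error loss
        beta, *_ = np.linalg.lstsq(Phi, resid, rcond=None)
        beta = soft_threshold(beta, lam)     # sparsify the wavelet coefficients
        fit += nu * (Phi @ beta)             # shrunken boosting update
    return fit


# Tiny usage example on a noisy, spatially inhomogeneous signal.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(size=200))
y = np.sin(8 * np.pi * x ** 2) + 0.2 * rng.normal(size=200)
yhat = wavelet_boost(x, y)
print("training RMSE:", np.sqrt(np.mean((y - yhat) ** 2)))
```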
This page was built for publication: Wavelet-based gradient boosting