General Sparse Boosting: Improving Feature Selection of L2Boosting by Correlation-Based Penalty Family
From MaRDI portal
Publication: 5265816
DOI: 10.1080/03610918.2013.824586 · zbMath: 1328.62254 · OpenAlex: W1976111164 · MaRDI QID: Q5265816
Publication date: 29 July 2015
Published in: Communications in Statistics - Simulation and Computation
Full work available at URL: https://doi.org/10.1080/03610918.2013.824586
MSC classifications: Nonparametric regression and quantile regression (62G08) ⋮ Estimation in multivariate analysis (62H12)
Related Items (3)
Two-step sparse boosting for high-dimensional longitudinal data with varying coefficients ⋮ A new approach of subgroup identification for high-dimensional longitudinal data ⋮ Variable selection in classification model via quadratic programming
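The publication and the related items above all build on component-wise L2Boosting as the base feature-selection procedure. A minimal sketch of that base algorithm is below; it is illustrative only and does not reproduce the paper's correlation-based penalty family (the selection rule here is plain residual-sum-of-squares, and the data are synthetic).

```python
import numpy as np

def l2boost(X, y, n_steps=200, nu=0.1):
    """Component-wise L2Boosting: at each step, fit a single predictor
    to the current residuals and take a small shrunken step (nu)."""
    n, p = X.shape
    beta = np.zeros(p)
    intercept = y.mean()
    resid = y - intercept
    for _ in range(n_steps):
        # least-squares coefficient of each predictor against the residuals
        coefs = X.T @ resid / (X ** 2).sum(axis=0)
        # pick the coordinate that most reduces the residual sum of squares
        sse = ((resid[:, None] - X * coefs) ** 2).sum(axis=0)
        j = np.argmin(sse)
        beta[j] += nu * coefs[j]
        resid -= nu * coefs[j] * X[:, j]
    return intercept, beta

# toy data: only the first two of ten predictors matter
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = 3 * X[:, 0] - 2 * X[:, 1] + 0.1 * rng.standard_normal(200)
intercept, beta = l2boost(X, y)
selected = np.nonzero(np.abs(beta) > 1e-8)[0]
```

Because each step updates only one coordinate and the step size is shrunken, early stopping leaves many coefficients exactly at zero, which is what makes the procedure a feature selector; the paper's penalty family modifies the coordinate-selection criterion to account for correlation among predictors.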
Cites Work
- Greedy function approximation: A gradient boosting machine.
- The Adaptive Lasso and Its Oracle Properties
- Boosting algorithms: regularization, prediction and model fitting
- Discussion: One-step sparse estimates in nonconcave penalized likelihood models
- Boosting additive models using component-wise P-splines
- Hedonic housing prices and the demand for clean air
- Arcing classifiers. (With discussion)
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Least angle regression. (With discussion)
- Pathwise coordinate optimization
- Boosting for high-dimensional linear models
- Regression and time series model selection in small samples
- Ideal spatial adaptation by wavelet shrinkage
- Model Selection and the Principle of Minimum Description Length
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Boosting with the L2 Loss
- Applied Linear Regression
- Regularization and Variable Selection Via the Elastic Net