On the selection of predictors by using greedy algorithms and information theoretic criteria
From MaRDI portal
Publication: 6075184
DOI: 10.1111/anzs.12387
zbMath: 1521.62159
OpenAlex: W4382791831
MaRDI QID: Q6075184
Ciprian Doru Giurcăneanu, Christopher M. Triggs, Unnamed Author
Publication date: 20 October 2023
Published in: Australian & New Zealand Journal of Statistics
Full work available at URL: https://doi.org/10.1111/anzs.12387
Mathematics Subject Classification:
- Inference from stochastic processes and prediction (62M20)
- Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
- Statistical aspects of information-theoretic topics (62B10)
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Greedy algorithms for prediction
- Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee
- Degrees of freedom in lasso problems
- Boosting algorithms: regularization, prediction and model fitting
- Statistics for high-dimensional data. Methods, theory and applications.
- A new perspective on boosting in linear regression via subgradient optimization and relatives
- Modeling by shortest data description
- Estimating the dimension of a model
- Minimum message length inference of the Poisson and geometric models using heavy-tailed prior distributions
- A large-sample model selection criterion based on Kullback's symmetric divergence
- Approximation and learning by greedy algorithms
- Extended Bayesian information criteria for model selection with large model spaces
- The horseshoe estimator for sparse signals
- The Bayesian Lasso
- Smoothing Parameter Selection in Nonparametric Regression Using an Improved Akaike Information Criterion
- Model Selection and the Principle of Minimum Description Length
- Boosting With the L2 Loss
- MDL Denoising Revisited
- A Model Selection Criterion for High-Dimensional Linear Regression
- Stability Selection
- Matching pursuits with time-frequency dictionaries
- Variable Selection with Error Control: Another Look at Stability Selection
- A Small Sample Model Selection Criterion Based on Kullback's Symmetric Divergence
- A new look at the statistical model identification