Inadmissibility of the corrected Akaike information criterion
From MaRDI portal
Publication: 6201858
DOI: 10.3150/23-bej1638
arXiv: 2211.09326
OpenAlex: W4391458284
MaRDI QID: Q6201858
Publication date: 26 March 2024
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/2211.09326
Keywords: Akaike information criterion; admissibility; loss estimation; Kullback-Leibler discrepancy; corrected Akaike information criterion
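For context on the keywords above, the standard AIC and its small-sample correction (AICc) follow well-known textbook formulas; the sketch below is illustrative only and is not taken from the linked paper:

```python
def aic(log_likelihood: float, k: int) -> float:
    """Akaike information criterion: 2k - 2 * log-likelihood."""
    return 2 * k - 2 * log_likelihood


def aicc(log_likelihood: float, k: int, n: int) -> float:
    """Corrected AIC (AICc): AIC plus the small-sample
    penalty 2k(k+1)/(n - k - 1), valid when n > k + 1."""
    return aic(log_likelihood, k) + 2 * k * (k + 1) / (n - k - 1)


# Hypothetical example: a 3-parameter model fit to n = 20 observations.
print(aic(-42.0, 3))       # 90.0
print(aicc(-42.0, 3, 20))  # 91.5 (extra penalty 24/16 = 1.5)
```

The correction term vanishes as n grows, so AICc converges to AIC for large samples.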
Cites Work
- Pitman closeness properties of Bayes shrinkage procedures in estimation and prediction
- Consistency of high-dimensional AIC-type and \(C_p\)-type criteria in multivariate linear regression
- Least squares model averaging by Mallows criterion
- Estimation of normal means: Frequentist estimation of loss
- Bayesian shrinkage prediction for the regression problem
- Multivariate empirical Bayes and estimation of covariance matrices
- Multivariate reduced-rank regression
- Unifying the derivations for the Akaike and corrected Akaike information criteria
- On unbiased and improved loss estimation for the mean of a multivariate normal distribution with unknown variance
- Improved loss estimation for a normal mean matrix
- On Bayes and unbiased estimators of loss
- Shrinkage estimation
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- A consistency property of the AIC for multivariate linear models when the dimension and the sample size are large
- Improved loss estimation for the lasso: a variable selection tool
- Information criteria and statistical modeling
- On improved loss estimation for shrinkage estimators
- From Fixed-X to Random-X Regression: Bias-Variance Decompositions, Covariance Penalties, and Prediction Error Estimation
- Regression and time series model selection in small samples
- Goodness of prediction fit
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Information criteria for the predictive evaluation of Bayesian models
- Modified AIC and Cp in multivariate linear regression
- Improved estimation of the expected Kullback–Leibler discrepancy in case of misspecification
- Model Selection and Multimodel Inference
- Model Selection for Multivariate Regression in Small Samples
- Akaike's Information Criterion, Cp and Estimators of Loss for Elliptically Symmetric Distributions
- Estimation under matrix quadratic loss and matrix superharmonicity
- Least Squares Model Averaging
- Ancillary Statistics and Estimation of the Loss in Estimation Problems
- Empirical Bayes on vector observations: An extension of Stein's method