Asymptotic biases of information and cross-validation criteria under canonical parametrization
DOI: 10.1080/03610926.2017.1422759
OpenAlex: W2792051238
MaRDI QID: Q5078294
Publication date: 23 May 2022
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610926.2017.1422759
Keywords: infinitely divisible; Akaike information criterion; exponential family of distributions; Takeuchi information criterion; leave-\(k\)-out method
MSC classification: Multivariate distribution of statistics (62H10); Asymptotic properties of parametric estimators (62F12)
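The keywords name the model-selection criteria treated in the paper: the Akaike and Takeuchi information criteria and leave-\(k\)-out cross-validation for exponential families under canonical parametrization. The following is a minimal illustrative sketch, not code from the paper, that computes AIC, TIC, and a leave-one-out (the \(k = 1\) case) cross-validation score for an i.i.d. Poisson model written in its canonical parameter \(\theta = \log\mu\); the data are invented for illustration.

```python
# Illustrative sketch only: AIC, TIC, and leave-one-out CV for an i.i.d.
# Poisson model in canonical parametrization theta = log(mu).
import numpy as np
from scipy.special import gammaln

def loglik(theta, x):
    # Poisson log-likelihood in the canonical parameter theta = log(mu)
    return np.sum(x * theta - np.exp(theta) - gammaln(x + 1))

def mle(x):
    # For the canonical Poisson model the MLE is theta_hat = log(sample mean)
    return np.log(np.mean(x))

def aic(x):
    # AIC: -2 * maximized log-likelihood + 2 * (number of parameters), here 1
    return -2.0 * loglik(mle(x), x) + 2.0 * 1

def tic(x):
    # TIC: the penalty 2*p is replaced by 2 * tr(J^{-1} I), with J the
    # average negative second derivative and I the average squared score,
    # both evaluated at the MLE.
    theta_hat = mle(x)
    score = x - np.exp(theta_hat)   # per-observation score d/dtheta log f
    J = np.exp(theta_hat)           # average of -d^2/dtheta^2 log f
    I = np.mean(score ** 2)         # average squared score
    return -2.0 * loglik(theta_hat, x) + 2.0 * I / J

def loo_cv(x):
    # Leave-one-out CV score: -2 times the sum of held-out log-likelihoods,
    # each evaluated at the MLE computed from the remaining observations.
    out = 0.0
    for i in range(len(x)):
        theta_i = mle(np.delete(x, i))
        out += x[i] * theta_i - np.exp(theta_i) - gammaln(x[i] + 1)
    return -2.0 * out

x = np.array([2, 0, 3, 1, 4, 2, 1, 3, 0, 2], dtype=float)
print(aic(x), tic(x), loo_cv(x))
```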
Cites Work
- A class of cross-validatory model selection criteria
- Cross validation model selection criteria for linear regression based on the Kullback-Leibler discrepancy
- Bias correction of cross-validation criterion based on Kullback-Leibler information under a general condition
- Corrected versions of cross-validation criteria for selecting multivariate regression and growth curve models
- Cross-validation methods
- Extensions of Pearson's inequality between skewness and kurtosis to multivariate cases
- Corrected version of \(AIC\) for selecting multivariate normal linear regression models in a general nonnormal case
- Iterative bias correction of the cross-validation criterion
- Bias corrections of some criteria for selecting multivariate linear models in a general nonnormal case
- Regression and time series model selection in small samples
- Modified AIC and Cp in multivariate linear regression
- Asymptotic cumulants of some information criteria
- Measures of multivariate skewness and kurtosis with applications
- Mean square error of prediction as a criterion for selecting variables
- On information and sufficiency