Optimal information criteria minimizing their asymptotic mean square errors
DOI: 10.1007/s13571-016-0115-9
zbMath: 1358.62029
OpenAlex: W2341493365
MaRDI QID: Q506001
Publication date: 27 January 2017
Published in: Sankhyā, Series B
Full work available at URL: https://doi.org/10.1007/s13571-016-0115-9
Keywords: asymptotic bias; asymptotic mean square error; Akaike information criterion; Kullback-Leibler distance; Takeuchi information criterion
MSC classifications: Asymptotic properties of parametric estimators (62F12); Point estimation (62F10); Statistical aspects of information-theoretic topics (62B10)
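The keyword list names the Akaike information criterion (AIC) and the Takeuchi information criterion (TIC), which the paper studies through their asymptotic bias and mean square error. As an illustrative sketch only (not code from the paper), the snippet below computes AIC and TIC for an i.i.d. normal model fitted by maximum likelihood; the data source and all variable names are hypothetical.

```python
# Minimal sketch (assumed setup, not from the paper): AIC and TIC for a
# normal model N(mu, sigma^2) fitted by maximum likelihood to i.i.d. data.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_t(df=5, size=200)  # heavy-tailed data: the normal model is misspecified
n = x.size

# MLEs of the normal model (variance MLE uses divisor n)
mu_hat = x.mean()
sig2_hat = x.var()

# Maximized log-likelihood: sum of normal log-densities at the MLE
loglik = -0.5 * n * (np.log(2 * np.pi * sig2_hat) + 1.0)

# AIC penalty: twice the number of free parameters (here k = 2)
aic = -2.0 * loglik + 2 * 2

# TIC penalty: 2 * tr(J^{-1} K), where J is the (negative) mean Hessian of the
# log-likelihood and K the mean outer product of per-observation scores,
# both evaluated at the MLE.
z = x - mu_hat
s_mu = z / sig2_hat                                   # score w.r.t. mu
s_sig2 = -0.5 / sig2_hat + 0.5 * z**2 / sig2_hat**2   # score w.r.t. sigma^2
S = np.column_stack([s_mu, s_sig2])
K = S.T @ S / n
J = np.array([[1.0 / sig2_hat, 0.0],
              [0.0, 0.5 / sig2_hat**2]])              # information matrix at the MLE
tic = -2.0 * loglik + 2.0 * np.trace(np.linalg.solve(J, K))

print(f"AIC = {aic:.2f}, TIC = {tic:.2f}")
```

When the model is correctly specified, K approaches J, so tr(J^{-1} K) approaches the parameter count and TIC reduces to AIC; with the heavy-tailed data above the two penalties differ, which is the situation where their bias and mean square error behavior diverges.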
Related Items (2)
- Predictive estimation of a covariance matrix and its structural parameters
- An asymptotic equivalence of the cross-data and predictive estimators
Cites Work
- Asymptotic cumulants of the estimator of the canonical parameter in the exponential family
- Asymptotic expansions for the pivots using log-likelihood derivatives with an application in item response theory
- Estimating the dimension of a model
- Bootstrapping log likelihood and EIC, an extension of AIC
- Asymptotic theory for information criteria in model selection -- functional approach
- Information criteria and statistical modeling
- Bias Adjustment Minimizing the Asymptotic Mean Square Error
- Regression and time series model selection in small samples
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Modified AIC and Cp in multivariate linear regression
- Generalised information criteria in model selection
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- Some Comments on C_p
- On Information and Sufficiency