A corrected Akaike criterion based on Kullback's symmetric divergence: applications in time series, multiple and multivariate regression
From MaRDI portal
Publication: Q959247
DOI: 10.1016/j.csda.2005.01.007
zbMath: 1445.62011
OpenAlex: W2014257223
MaRDI QID: Q959247
Bezza Hafidi, Abdallah Mkhadri
Publication date: 11 December 2008
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2005.01.007
Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
Linear regression; mixed models (62J05)
Statistical aspects of information-theoretic topics (62B10)
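For context, the family of criteria this publication corrects can be illustrated with a minimal sketch. The sketch assumes a Gaussian linear regression and uses the uncorrected large-sample form KIC = -2 log L + 3k of the Kullback-symmetric-divergence criterion (the Cavanaugh-type criterion cited below); the paper's small-sample correction itself is not reproduced here, and all data below are synthetic.

```python
# Rough sketch (not from the paper itself) contrasting the Akaike
# information criterion AIC = -2 log L + 2k with the criterion
# KIC = -2 log L + 3k based on Kullback's symmetric divergence.
import numpy as np

def criteria_for_linear_model(y, X):
    """Return (AIC, KIC) for a Gaussian linear regression of y on X."""
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    sigma2 = rss / n  # maximum-likelihood estimate of the error variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = p + 1  # regression coefficients plus the variance parameter
    aic = -2 * loglik + 2 * k
    kic = -2 * loglik + 3 * k  # heavier penalty from the symmetric divergence
    return aic, kic

# Synthetic example: 50 observations, intercept plus 3 regressors.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 3))])
y = X @ np.array([1.0, 0.5, 0.0, 0.0]) + rng.normal(size=50)
aic, kic = criteria_for_linear_model(y, X)
print(aic, kic)  # KIC exceeds AIC by exactly k, so it favors smaller models
```

Both criteria are minimized over candidate models; the stiffer penalty term 3k is what distinguishes the symmetric-divergence approach, and the corrected version studied in this paper adjusts that penalty for small samples.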
Related Items (7)
- Information criteria: how do they behave in different models?
- Iterative Bias Correction of the Cross-Validation Criterion
- Unnamed Item
- Blind deconvolution of the aortic pressure waveform using the Malliavin calculus
- Multichannel blind deconvolution using the stochastic calculus for the estimation of the central arterial pressure
- The Kullback information criterion for mixture regression models
- A small-sample criterion based on Kullback's symmetric divergence for vector autoregressive modeling
Cites Work
- Unnamed Item
- Estimating the dimension of a model
- Unifying the derivations for the Akaike and corrected Akaike information criteria
- A large-sample model selection criterion based on Kullback's symmetric divergence
- Regression and time series model selection in small samples
- Modified AIC and Cp in multivariate linear regression
- Model selection for multivariate regression in small samples
- A corrected Akaike information criterion for vector autoregressive model selection