A large-sample model selection criterion based on Kullback's symmetric divergence
From MaRDI portal
Publication:1962213
DOI: 10.1016/S0167-7152(98)00200-4
zbMath: 0955.62012
OpenAlex: W1966038674
Wikidata: Q127305850 (Scholia: Q127305850)
MaRDI QID: Q1962213
Publication date: 27 February 2001
Published in: Statistics & Probability Letters
Full work available at URL: https://doi.org/10.1016/s0167-7152(98)00200-4
Keywords: relative entropy; Akaike information criterion; I-divergence; Kullback-Leibler information; simulations; AIC; J-divergence
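The record carries no abstract, but the criterion named in the title is commonly written KIC = -2 log L(θ̂) + 3k, i.e. AIC's 2k penalty replaced by 3k, the penalty arising from Kullback's symmetric (J-) divergence rather than the directed Kullback-Leibler divergence. The sketch below is illustrative only, assuming that large-sample form; the function names and the Gaussian-regression toy data are not taken from the publication.

```python
import numpy as np

def aic(log_lik, k):
    """Akaike information criterion: -2 * log-likelihood + 2 * (number of parameters)."""
    return -2.0 * log_lik + 2.0 * k

def kic(log_lik, k):
    """Symmetric-divergence criterion (KIC), assumed large-sample form: -2 * log-likelihood + 3k."""
    return -2.0 * log_lik + 3.0 * k

def gaussian_regression_loglik(y, X):
    """Maximized Gaussian log-likelihood of a linear regression fitted by least squares."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n            # ML estimate of the error variance
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)

# Toy comparison of candidate polynomial orders (synthetic data, purely illustrative)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=200)

for order in range(1, 5):
    X = np.vander(x, order + 1, increasing=True)
    ll = gaussian_regression_loglik(y, X)
    k = order + 2                         # regression coefficients plus the error variance
    print(order, round(aic(ll, k), 2), round(kic(ll, k), 2))
```

Because of the larger penalty, KIC tends to favour more parsimonious candidates than AIC in comparisons of this kind.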
Related Items
- Optimal model averaging estimator for multinomial logit models
- Information criteria for Fay-Herriot model selection
- Variable selection in generalized random coefficient autoregressive models
- Inference after separated hypotheses testing: an empirical investigation for linear models
- Model selection criteria based on cross-validatory concordance statistics
- Iterative Bias Correction of the Cross-Validation Criterion
- Model selection criteria based on Kullback information measures for nonlinear regression
- A new correction approach for information criteria to detect outliers in regression modeling
- Variable selection using the EM and CEM algorithms in mixtures of linear mixed models
- Information-theoretic model-averaged benchmark dose analysis in environmental risk assessment
- On the selection of predictors by using greedy algorithms and information theoretic criteria
- On goodness-of-fit measures for Poisson regression models
- Tracking interval for selecting between non-nested models: an investigation for type II right censored data
- Is First-Order Vector Autoregressive Model Optimal for fMRI Data?
- Estimation of stationary autoregressive models with the Bayesian LASSO
- A corrected Akaike criterion based on Kullback's symmetric divergence: applications in time series, multiple and multivariate regression
- Model selection in the presence of nonstationarity
- The Kullback information criterion for mixture regression models
- Order selection criteria for vector autoregressive models
- An alternate version of the conceptual predictive statistic based on a symmetrized discrepancy measure
- A small-sample criterion based on Kullback's symmetric divergence for vector autoregressive modeling
- An alternative quasi likelihood approach, Bayesian analysis and data-based inference for model specification
- Order selection in finite mixtures of linear regressions
- Asymptotic bootstrap corrections of AIC for linear regression models
- Closed Likelihood Ratio Testing Procedures to Assess Similarity of Covariance Matrices
Cites Work
- Modeling by shortest data description
- Asymptotically efficient selection of the order of the model for estimating parameters of a linear process
- Estimating the dimension of a model
- Unifying the derivations for the Akaike and corrected Akaike information criteria
- Fitting autoregressive models for prediction
- Estimating the Error Rate of a Prediction Rule: Improvement on Cross-Validation
- On the unbiasedness property of AIC for exact or approximating linear stochastic time series models
- How Biased is the Apparent Error Rate of a Prediction Rule?
- Regression and time series model selection in small samples
- Some recent advances in time series modeling
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Assessing the accuracy of the maximum likelihood estimator: Observed versus expected Fisher information
- Model Selection for Multivariate Regression in Small Samples
- A corrected Akaike information criterion for vector autoregressive model selection
- Some Comments on \(C_p\)
- On Information and Sufficiency
- An invariant form for the prior probability in estimation problems
- A new look at the statistical model identification