Kullback-Leibler divergence to evaluate posterior sensitivity to different priors for autoregressive time series models
Publication: 5085931
DOI: 10.1080/03610918.2017.1410709
OpenAlex: W2791087833
MaRDI QID: Q5085931
Publication date: 30 June 2022
Published in: Communications in Statistics - Simulation and Computation
Full work available at URL: https://doi.org/10.1080/03610918.2017.1410709
Keywords: Jeffreys' prior, Kullback-Leibler divergence, multivariate t distribution, natural conjugate prior, g-prior, distance of posteriors, Kullback-Leibler calibration
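As an informal illustration of the "Kullback-Leibler divergence" and "distance of posteriors" keywords (a minimal sketch, not taken from the paper): if the posteriors of an autoregressive coefficient vector under two different priors are summarized by multivariate normal approximations, the KL divergence between them has a closed form. The function kl_mvn and the numerical values below are hypothetical.

# Hypothetical sketch: closed-form KL( N(mu0, cov0) || N(mu1, cov1) ) between
# two Gaussian posterior approximations, e.g. posteriors of AR coefficients
# obtained under two different priors. Not the paper's actual procedure.
import numpy as np

def kl_mvn(mu0, cov0, mu1, cov1):
    """KL divergence between two k-dimensional multivariate normals."""
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    term_trace = np.trace(cov1_inv @ cov0)            # tr(cov1^{-1} cov0)
    term_quad = diff @ cov1_inv @ diff                # Mahalanobis-type term
    term_logdet = np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    return 0.5 * (term_trace + term_quad - k + term_logdet)

# Hypothetical posterior summaries of an AR(2) coefficient vector under two
# priors; a small KL value would suggest low sensitivity to the prior choice.
mu_a = np.array([0.55, -0.20])
cov_a = np.array([[0.010, 0.002],
                  [0.002, 0.012]])
mu_b = np.array([0.53, -0.18])
cov_b = np.array([[0.011, 0.001],
                  [0.001, 0.013]])
print(kl_mvn(mu_a, cov_a, mu_b, cov_b))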
Related Items (3)
- Bayesian identification of double seasonal autoregressive time series models
- Gibbs sampling for Bayesian estimation of triple seasonal autoregressive models
- Bayesian analysis of double seasonal autoregressive models
Cites Work
- Kullback-Leibler divergence measure for multivariate skew-normal distributions
- Statistical decision theory and Bayesian analysis. 2nd ed
- Expressions for Rényi and Shannon entropies for multivariate distributions
- On maximum entropy characterization of Pearson's type II and VII multivariate distributions
- Bayesian inference for double SARMA models
- Bayesian Identification of Moving Average Models
- On Information and Sufficiency
- On Bayesian Identification of Autoregressive Processes
- Bayesian Inference for Double Seasonal Moving Average Models: A Gibbs Sampling Approach
- Benchmark priors for Bayesian model averaging.