Generalization of Jeffreys Divergence-Based Priors for Bayesian Hypothesis Testing
From MaRDI portal
Publication: 4632606
DOI: 10.1111/j.1467-9868.2008.00667.x
zbMath: 1411.62042
arXiv: 0801.4224
OpenAlex: W2114062412
MaRDI QID: Q4632606
M. J. Bayarri, Gonzalo García-Donato
Publication date: 30 April 2019
Published in: Journal of the Royal Statistical Society Series B: Statistical Methodology
Full work available at URL: https://arxiv.org/abs/0801.4224
Keywords: Bayes factors; Kullback-Leibler divergence; mixture models; intrinsic priors; information consistency; irregular models
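The keywords above reference the Jeffreys (symmetrized Kullback-Leibler) divergence on which the paper's priors are built. As a minimal illustrative sketch only (not the paper's prior construction), the closed form for two univariate normal distributions can be computed as:

```python
import math

def kl_normal(mu1, s1, mu2, s2):
    """KL divergence KL(N(mu1, s1^2) || N(mu2, s2^2)) in closed form."""
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def jeffreys_divergence(mu1, s1, mu2, s2):
    """Jeffreys divergence: the symmetrized sum KL(p||q) + KL(q||p)."""
    return kl_normal(mu1, s1, mu2, s2) + kl_normal(mu2, s2, mu1, s1)

# With equal variances sigma^2, the Jeffreys divergence between N(mu1, sigma^2)
# and N(mu2, sigma^2) reduces to (mu1 - mu2)^2 / sigma^2.
print(jeffreys_divergence(0.0, 1.0, 1.0, 1.0))  # 1.0
```

The function names here are hypothetical conveniences; the divergence-based priors of the paper scale such divergences between the null and alternative models to calibrate the prior spread.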
Related Items (14)
- Objective Bayesian testing on the common mean of several normal distributions under divergence-based priors
- Prior distributions for objective Bayesian analysis
- Objective Bayesian testing for the correlation coefficient under divergence-based priors
- Objective Bayesian testing for the linear combinations of normal means
- Objective Bayesian inference for the intraclass correlation coefficient in linear models
- The Effective Sample Size
- Priors via imaginary training samples of sufficient statistics for objective Bayesian hypothesis testing
- Objective Bayesian comparison of order-constrained models in contingency tables
- Copula modelling with penalized complexity priors: the bivariate case
- Bayesian age-stratified joinpoint regression model: an application to lung and brain cancer mortality
- Penalising model component complexity: a principled, practical approach to constructing priors
- On the existence of uniformly most powerful Bayesian tests with application to non-central chi-squared tests
- A model selection approach for variable selection with censored data
- Objective Bayesian tests for Fieller-Creasy problem