Converting information into probability measures with the Kullback-Leibler divergence
DOI: 10.1007/s10463-012-0350-4 · zbMath: 1253.62005 · OpenAlex: W2065760629 · MaRDI QID: Q1925991
Pier Giovanni Bissiri, Stephen G. Walker
Publication date: 27 December 2012
Published in: Annals of the Institute of Statistical Mathematics
Full work available at URL: https://doi.org/10.1007/s10463-012-0350-4
Mathematics Subject Classification: Bayesian problems; characterization of Bayes procedures (62C10) ⋮ Statistical aspects of information-theoretic topics (62B10) ⋮ General considerations in statistical decision theory (62C05)
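For reference, the Kullback-Leibler divergence named in the title is, in its standard measure-theoretic form (the notation $P$, $Q$ is supplied here, not taken from this record),

\[ D_{\mathrm{KL}}(P \,\|\, Q) \;=\; \int \log\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right) \mathrm{d}P, \]

defined when $P$ is absolutely continuous with respect to $Q$, and taken as $+\infty$ otherwise.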
Related Items (5)
Bayesian inference with misspecified models ⋮ On Bayesian learning via loss functions ⋮ On Bayesian learning from Bernoulli observations ⋮ Approximate models and robust decisions ⋮ On general Bayesian inference using loss functions
Cites Work
- On Bayesian learning from Bernoulli observations
- Statistical decision theory. Foundations, concepts, and methods
- Expected information as expected utility
- Remarks on the measurement of subjective probability and information
- $\alpha$-Divergence Is Unique, Belonging to Both $f$-Divergence and Bregman Divergence Classes