A Justification for Applying the Principle of Minimum Relative Entropy to Information Integration Problems
DOI: 10.1080/01966324.1993.10737344 · zbMath: 0789.62004 · OpenAlex: W2004529215 · Wikidata: Q58274893 · Scholia: Q58274893 · MaRDI QID: Q4280084
Publication date: 14 March 1994
Published in: American Journal of Mathematical and Management Sciences
Full work available at URL: https://doi.org/10.1080/01966324.1993.10737344
Keywords: Dirichlet prior; maximum likelihood; information integration; asymptotic equivalence; Bayes estimate; principle of minimum relative entropy
MSC: Bayesian inference (62F15); Bayesian problems, characterization of Bayes procedures (62C10); Statistical aspects of information-theoretic topics (62B10)
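The keywords name the principle of minimum relative entropy, which selects the distribution closest (in Kullback-Leibler divergence) to a prior among all distributions satisfying given constraints. As a minimal illustration only (not taken from the paper itself), the following sketch solves the classic finite case: under a mean constraint, the minimizer is an exponential tilting of the prior, and the tilting parameter can be found by bisection. All function names here are hypothetical.

```python
import math

def tilt(q, x, lam):
    # Exponentially tilted distribution: p_i proportional to q_i * exp(lam * x_i).
    w = [qi * math.exp(lam * xi) for qi, xi in zip(q, x)]
    z = sum(w)
    return [wi / z for wi in w]

def min_relative_entropy(q, x, m, lo=-50.0, hi=50.0, iters=200):
    """Minimize KL(p || q) over p subject to E_p[x] = m.

    The tilted mean is increasing in the tilting parameter lam,
    so plain bisection on lam locates the constrained minimizer.
    """
    for _ in range(iters):
        lam = (lo + hi) / 2.0
        p = tilt(q, x, lam)
        if sum(pi * xi for pi, xi in zip(p, x)) < m:
            lo = lam
        else:
            hi = lam
    return tilt(q, x, (lo + hi) / 2.0)

# Example: start from a uniform prior on a fair die and
# constrain the mean to 4.5; the result up-weights high faces.
q = [1 / 6] * 6
x = [1, 2, 3, 4, 5, 6]
p = min_relative_entropy(q, x, 4.5)
```

With no constraint (or a constraint the prior already satisfies), the minimizer is the prior itself, which is why this principle serves as a conservative rule for integrating new information with an existing distribution.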
Cites Work
- I-divergence geometry of probability distributions and minimization problems
- Evaluation of Porter's constant
- A comparison of the Shannon and Kullback information measures
- Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy
- Assessing Risks Through the Determination of Rare Event Probabilities