Informational divergence and the dissimilarity of probability distributions
From MaRDI portal
Publication: 1164917
DOI: 10.1007/BF02576360
zbMath: 0486.62006
MaRDI QID: Q1164917
Publication date: 1981
Published in: Calcolo
Keywords: symmetry; positivity; separation measure; dissimilarity coefficient; informational divergence; Kullback measure
MSC classes: Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
Related Items (5)
- Can generalised divergences help for invariant neural networks?
- Bounds on the probability of error in terms of generalized information radii
- Mixed f-divergence and inequalities for log-concave functions
- Generalized ‘useful’ non-symmetric divergence measures and inequalities
- Generalized Jensen difference divergence measures and Fisher measure of information
Cites Work
- An informational divergence geometry for stochastic matrices
- f-dissimilarity: A generalization of the affinity of several distributions
- I-divergence geometry of probability distributions and minimization problems
- An application of informational divergence to Huffman codes
- Information radius