Bounds on the probability of error in terms of generalized information radii
From MaRDI portal
Publication: 912072
DOI: 10.1016/0020-0255(90)90022-3
zbMath: 0697.94004
OpenAlex: W2003566922
MaRDI QID: Q912072
Publication date: 1990
Published in: Information Sciences
Full work available at URL: https://doi.org/10.1016/0020-0255(90)90022-3
Keywords: Shannon's entropy; probability of error; Jensen difference; information radius; entropy of degree \(\alpha\)
Related Items (1)
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Entropy differential metric, distance and divergence measures in probability spaces: A unified approach
- Trigonometric entropies, Jensen difference divergence measures, and error bounds
- Informational divergence and the dissimilarity of probability distributions
- Comments on “Entropies of degree β and lower bounds for the average error rate”
- On the convexity of some divergence measures based on entropy functions
- Entropies of degree β and lower bounds for the average error rate
- f-entropies, probability of error, and feature selection
- Information radius
- On Information and Sufficiency