On one extreme value problem for entropy and error probability
From MaRDI portal
Publication: 2263005
DOI: 10.1134/S003294601403016
zbMath: 1321.94025
OpenAlex: W2041262889
MaRDI QID: Q2263005
Publication date: 17 March 2015
Published in: Problems of Information Transmission
Full work available at URL: https://doi.org/10.1134/s003294601403016
- Measures of information, entropy (94A17)
- Statistical aspects of information-theoretic topics (62B10)
Related Items (2)
- On some extremal problems for mutual information and entropy
- On extreme values of the Rényi entropy under coupling of probability distributions
Cites Work
- Mutual information, variation, and Fano's inequality
- Generalization of a Pinsker problem
- On estimation of information via variation
- Estimating Mutual Information Via Kolmogorov Distance
- The Interplay Between Entropy and Variational Distance
- On the Interplay Between Conditional Entropy and Error Probability
- Entropy Bounds for Discrete Random Variables via Maximal Coupling
- Local Pinsker Inequalities via Stein's Discrete Density Approach