Estimating Mutual Information Via Kolmogorov Distance
From MaRDI portal
Publication: 3549038
DOI: 10.1109/TIT.2007.903122
zbMath: 1326.94038
MaRDI QID: Q3549038
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Measures of information, entropy (94A17)
Information theory (general) (94A15)
Statistical aspects of information-theoretic topics (62B10)
Related Items (15):
On some extremal problems for mutual information and entropy
Correlation distance and bounds for mutual information
Tight uniform continuity bounds for quantum entropies: conditional entropy, relative entropy distance and energy constraints
Optimal uniform continuity bound for conditional entropy of classical-quantum states
Continuity bounds on observational entropy and measured relative entropies
On one extremal problem for mutual information
Coupling of several random variables
Hadamard quantum broadcast channels
On coupling of probability distributions and estimating the divergence through variation
On Oblivious Transfer Capacity
On one extreme value problem for entropy and error probability
Mutual information of several random variables and its estimation via variation
Mutual information, variation, and Fano's inequality
Generalization of a Pinsker problem
Coupling of probability distributions and an extremal problem for the divergence