On the maximum values of \(f\)-divergence and Rényi divergence under a given variational distance
Publication: 2190977
DOI: 10.1134/S0032946020010019 · zbMath: 1443.60015 · OpenAlex: W3016623702 · MaRDI QID: Q2190977
Publication date: 23 June 2020
Published in: Problems of Information Transmission
Full work available at URL: https://doi.org/10.1134/s0032946020010019
MSC classifications: Probability distributions: general theory (60E05) · Measures of information, entropy (94A17) · Statistical aspects of information-theoretic topics (62B10)
Related Items (2)
- Remarks on reverse Pinsker inequalities
- The \(f\)-divergence and coupling of probability distributions
Cites Work
- On the minimum \(f\)-divergence for given total variation
- On measures of information and their characterizations
- Optimal upper bounds for the divergence of finite-dimensional distributions under a given variational distance
- \(f\)-Divergence Inequalities
- Tight Bounds for Symmetric Divergence Measures and a Refined Bound for Lossless Source Coding
- Rényi Divergence and Kullback-Leibler Divergence
- On Divergences and Informations in Statistics and Information Theory
- On functionals satisfying a data-processing theorem
- On Pinsker's and Vajda's Type Inequalities for Csiszár's \(f\)-Divergences
- Sharp Inequalities for \(f\)-Divergences