Bounds for entropy and divergence for distributions over a two-element set (Q2740879)
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Bounds for entropy and divergence for distributions over a two-element set | scientific article; zbMATH DE number 1641533 | |
Statements
- publication date: 15 October 2001
- keywords: measures of information; inequalities; best constants; polynomials; Shannon entropy; Kullback divergence
- title: Bounds for entropy and divergence for distributions over a two-element set (English)
Continuing joint research with \textit{P. Harremoës} [IEEE Trans. Inf. Theory 47 (2001)], the author offers the following inequalities for the two-term Shannon entropy \(H(p,q)=-p\log p-q\log q\) (transcribed from the \(\ln\) expressions in the paper; we write \(\log\) for \(\log_2\)):
\[
4pq\leq H(p,q)\leq (4pq)^{\log e/2},\qquad \log p\,\log q\geq H(p,q)\geq \frac{\log p\,\log q}{\log e}.
\]
Extending results of \textit{I. Csiszár} [Stud. Sci. Math. Hung. 2, 299-318 (1967; Zbl 0157.25802)], \textit{O. Krafft} [Ann. Inst. Stat. Math. 21, 219-220 (1969; Zbl 0176.49106)] and others, best constants are also offered for the approximation from below of the Kullback divergence \(D((p,q)\|(r,s))=p\log(p/r)+q\log(q/s)\) by polynomials of eighth degree in \(V=|p-r|+|q-s|\).
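The following short Python sketch is an editorial illustration, not part of the paper or the review: it numerically checks the four entropy bounds above on a grid, together with the classical degree-two member of the family of divergence lower bounds (Pinsker's inequality \(D\geq V^2/2\) in nats, i.e. \(D\geq (\log e)\,V^2/2\) in bits). The function names, grid sizes, and tolerance `EPS` are arbitrary choices of this sketch.

```python
# Minimal numerical sanity check of the quoted bounds (editorial illustration,
# not code from the paper). Logarithms are base 2, matching the review.
import math

def H(p: float) -> float:
    """Two-term Shannon entropy H(p, q) with q = 1 - p, in bits."""
    q = 1.0 - p
    return -(p * math.log2(p) + q * math.log2(q))

def D(p: float, r: float) -> float:
    """Kullback divergence D((p,q) || (r,s)) with q = 1 - p, s = 1 - r, in bits."""
    q, s = 1.0 - p, 1.0 - r
    return p * math.log2(p / r) + q * math.log2(q / s)

LOG_E = math.log2(math.e)  # log e; note (4pq)^(log e / 2) = (4pq)^(1/ln 4)
EPS = 1e-12                # slack for floating-point comparisons

# Entropy bounds: 4pq <= H(p,q) <= (4pq)^(log e / 2)
# and log p * log q >= H(p,q) >= log p * log q / log e.
for i in range(1, 200):
    p = i / 200
    q = 1.0 - p
    h = H(p)
    assert 4 * p * q <= h + EPS
    assert h <= (4 * p * q) ** (LOG_E / 2) + EPS
    lplq = math.log2(p) * math.log2(q)
    assert lplq + EPS >= h >= lplq / LOG_E - EPS

# Degree-two case of the polynomial lower bounds on D: Pinsker's inequality
# D >= V^2/2 in nats, i.e. D >= (log e) * V^2 / 2 in bits, V = |p-r| + |q-s|.
for i in range(1, 100):
    for j in range(1, 100):
        p, r = i / 100, j / 100
        V = abs(p - r) + abs((1.0 - p) - (1.0 - r))
        assert D(p, r) >= LOG_E * V * V / 2 - EPS

print("All bounds verified on the grid.")
```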