Entropy: An inequality (Q1825981)
From MaRDI portal
scientific article; zbMATH DE number 4122272
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Entropy: An inequality | scientific article; zbMATH DE number 4122272 | |
Statements
Entropy: An inequality (English)
1988
The authors provide a simple proof of an elementary inequality concerning entropy, which they have found useful in previous work on the Rudin-Shapiro sequence. Let \((p_k \mid 0\leq k<\infty)\) be a probability vector satisfying, for some \(\lambda>0\), \[ \lambda p_n \geq \sum^{\infty}_{k=n+1} p_k \quad (n=0,1,2,\dots). \] The authors give a bound for \(\sum^{\infty}_{k=0} p_k^{\alpha}\) with \(\alpha<1\) and use it to show that \[ \sum^{\infty}_{k=0} p_k \log\frac{1}{p_k} \leq \sum^{\infty}_{k=0} q_k \log\frac{1}{q_k}, \] where \((q_k)\) is the probability vector forming a geometric sequence for which \(\lambda q_n = \sum^{\infty}_{k=n+1} q_k\) \((n=0,1,2,\dots)\).
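For concreteness, the extremal geometric vector can be made explicit; the following short computation is an illustration added here, not part of the review text. Writing \(q_k=(1-r)r^k\) with \(0<r<1\), the tail condition \(\lambda q_n=\sum^{\infty}_{k=n+1}q_k=r^{n+1}\) forces \(\lambda(1-r)=r\), i.e. \(r=\lambda/(1+\lambda)\), and the right-hand side of the inequality evaluates to \[ \sum^{\infty}_{k=0} q_k \log\frac{1}{q_k} = -\log(1-r)-\frac{r}{1-r}\log r = (1+\lambda)\log(1+\lambda)-\lambda\log\lambda, \] so the stated bound reads \(\sum^{\infty}_{k=0} p_k \log\frac{1}{p_k} \leq (1+\lambda)\log(1+\lambda)-\lambda\log\lambda\).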
entropy of a probability distribution
Hölder inequality between means
elementary inequality concerning entropy
Rudin-Shapiro sequence