Proofs of Shannon's Hölder's and generalized Hölder's inequalities via coding theory (Q2737618)

From MaRDI portal

scientific article; zbMATH DE number 1645791

Language: English

    Statements

    Publication date: 11 September 2002

    Keywords: Shannon inequality; Hölder's inequality; mean codeword length; exponential mean codeword length; uniquely decipherable code; measures of information; Kerridge's inaccuracy; directed divergence
    Proofs of Shannon's Hölder's and generalized Hölder's inequalities via coding theory (English)

    Review: Let \(x_1, x_2, \dots, x_n\) be a finite set of \(n\) input symbols to be encoded using an alphabet of \(D\) symbols. Kraft proved that there is a uniquely decodable code with codeword lengths \(\ell_1, \ell_2, \dots, \ell_n\) if and only if \(\sum_{i=1}^n D^{-\ell_i} \leq 1\). If \(p_1, p_2, \dots, p_n\) are the probabilities of the codewords of lengths \(\ell_1, \ell_2, \dots, \ell_n\), respectively, then the mean codeword length is given by \(L = \sum_{i=1}^n p_i \ell_i\). By minimizing \(L\) subject to the constraint \(\sum_{i=1}^n D^{-\ell_i} = k\) (where \(k \in (0, 1]\)), the author obtains the well-known Shannon inequality
    \[
    \sum_{i=1}^n p_i \log {{p_i} \over {q_i}} \geq 0.
    \]
    Similarly, minimizing the exponential mean codeword length
    \[
    L_{\alpha} = {{\alpha} \over {1-\alpha}} \log_D \left( \sum_{i=1}^n p_i D^{{{1-\alpha} \over {\alpha}} \ell_i} \right)
    \]
    subject to the same constraint \(\sum_{i=1}^n D^{-\ell_i} = k\), the author obtains measures of information such as Kerridge's inaccuracy and the directed divergence. Using the directed divergence, the author gives an information-theoretic proof of the well-known Hölder inequality. This is an interesting elementary paper; however, it contains several obvious misprints.

    For the entire collection see [Zbl 0960.00033].
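    As a quick numerical illustration of the quantities named in the review (not taken from the paper itself; function names are ours), the following sketch checks Kraft's inequality for a binary prefix code, verifies that the directed-divergence sum of Shannon's inequality is nonnegative, and confirms that the exponential mean codeword length \(L_{\alpha}\) approaches the ordinary mean length \(L\) as \(\alpha \to 1\):

    ```python
    import math

    def kraft_sum(lengths, D=2):
        """Sum_i D^{-l_i}: Kraft proved a uniquely decodable code with these
        codeword lengths exists iff this sum is <= 1."""
        return sum(D ** -l for l in lengths)

    def mean_length(p, lengths):
        """Mean codeword length L = sum_i p_i * l_i."""
        return sum(pi * li for pi, li in zip(p, lengths))

    def exp_mean_length(p, lengths, alpha, D=2):
        """Exponential mean codeword length L_alpha from the review
        (defined for alpha > 0, alpha != 1; tends to L as alpha -> 1)."""
        s = sum(pi * D ** (((1 - alpha) / alpha) * li) for pi, li in zip(p, lengths))
        return (alpha / (1 - alpha)) * math.log(s, D)

    def directed_divergence(p, q):
        """Sum_i p_i log(p_i / q_i); Shannon's inequality says this is >= 0."""
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    # Binary prefix code {0, 10, 110, 111}: lengths 1, 2, 3, 3 over D = 2 symbols
    lengths = [1, 2, 3, 3]
    p = [0.5, 0.25, 0.125, 0.125]   # codeword probabilities
    q = [0.25, 0.25, 0.25, 0.25]    # any other probability distribution

    print(kraft_sum(lengths))              # 1.0: Kraft holds with equality
    print(directed_divergence(p, q) >= 0)  # True: Shannon's inequality
    print(abs(exp_mean_length(p, lengths, 0.999) - mean_length(p, lengths)) < 0.01)  # True
    ```

    Here \(q_i\) plays the role of the comparison distribution in the directed divergence; the choice of code and probabilities above is arbitrary and only meant to exercise the definitions.
    
    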
