Some properties of the relative entropy density of arbitrary information source (Q922434)

From MaRDI portal
scientific article; zbMATH DE number 4168380

    Statements

    Some properties of the relative entropy density of arbitrary information source (English)
    1990
Let \(\{X_n,\ n\geq 1\}\) be an arbitrary information source with alphabet \(S=\{1,2,\dots,m\}\) and joint distribution \(P(X_1=x_1,\dots,X_n=x_n)=p(x_1,\dots,x_n)>0\), \(x_i\in S\), \(1\leq i\leq n\). For \(k\in S\), let \(S_n(k,\omega)\) denote the number of occurrences of \(k\) in the partial sequence \(X_1(\omega),\dots,X_n(\omega)\), and let \((p_{i1},p_{i2},\dots,p_{im})\), \(p_{ik}>0\), \(i=1,2,\dots\), be a sequence of probability distributions on \(S\). The difference \[ \phi_n(\omega)=\frac1n\log p(X_1,\dots,X_n)-\frac1n\sum_{i=1}^{n}\log p_{iX_i} \] is called the entropy density deviation of \(\{X_i,\ 1\leq i\leq n\}\) relative to the distribution \(\prod_{i=1}^{n}p_{ix_i}\). Let \(0\leq c\leq 1\) be a constant and set \[ S_*(k,c)=\Bigl\{\omega:\ \liminf_{n\to\infty}\frac1n\Bigl[S_n(k,\omega)-\sum_{i=1}^{n}p_{ik}\Bigr]\geq c\Bigr\}, \] \[ S^*(k,c)=\Bigl\{\omega:\ \limsup_{n\to\infty}\frac1n\Bigl[S_n(k,\omega)-\sum_{i=1}^{n}p_{ik}\Bigr]\leq -c\Bigr\}, \] \(S_*(c)=\bigcup_{k=1}^{m}S_*(k,c)\), and \(S^*(c)=\bigcup_{k=1}^{m}S^*(k,c)\). The following theorems are proved: \[ (1)\quad \liminf_{n\to\infty}\phi_n(\omega)\geq (1+c)\log(1+c)-c\quad \text{a.e.,}\quad \omega\in S_*(c); \] (2) \(\liminf_{n\to\infty}\phi_n(\omega)\geq (1-c)\log(1-c)+c\) a.e., \(\omega\in S^*(c)\), when \(0\leq c<1\), and \(\liminf_{n\to\infty}\phi_n(\omega)\geq 1\) a.e., \(\omega\in S^*(1)\), when \(c=1\). Throughout the paper the author works with the underlying probability space \(([0,1),\mathcal{T},P)\), where \(\mathcal{T}\) is the class of Lebesgue measurable sets in the interval \([0,1)\) and \(P\) is Lebesgue measure. The crucial part of the proof is the application of Lebesgue's theorem on the differentiability of monotone functions to the study of a.e. convergence.
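The quantities \(\phi_n(\omega)\) and \(S_n(k,\omega)\) in the review can be illustrated with a minimal numerical sketch. The function names and the memoryless example source below are ours, not the paper's; the paper's setting allows an arbitrary joint distribution, which the caller would supply as a log-probability.

```python
import math

def entropy_density_deviation(xs, log_joint, ref_dists):
    """phi_n = (1/n) log p(x_1,...,x_n) - (1/n) sum_i log p_{i,x_i}.

    xs        : observed symbols x_1,...,x_n (alphabet encoded as 0,...,m-1)
    log_joint : log p(x_1,...,x_n) under the source (supplied by the caller)
    ref_dists : the reference distributions (p_{i1},...,p_{im}), one per step i
    """
    n = len(xs)
    ref_log = sum(math.log(ref_dists[i][x]) for i, x in enumerate(xs))
    return (log_joint - ref_log) / n

def occurrence_count(xs, k):
    """S_n(k): number of occurrences of symbol k in x_1,...,x_n."""
    return sum(1 for x in xs if x == k)

# Illustration with a hypothetical memoryless source q on S = {0, 1},
# measured against the uniform reference distributions p_i = (1/2, 1/2).
q = [0.7, 0.3]
xs = [0, 1, 0, 0, 1]
log_joint = sum(math.log(q[x]) for x in xs)   # joint factorizes for i.i.d.
ref_dists = [[0.5, 0.5]] * len(xs)
phi = entropy_density_deviation(xs, log_joint, ref_dists)
# For this i.i.d. case phi reduces to (1/n) * sum_i log(q[x_i] / 0.5).
```

For a genuinely non-i.i.d. source one would replace `log_joint` by the log of the actual joint probability; the theorems then bound \(\liminf_n \phi_n\) from below on the sets \(S_*(c)\) and \(S^*(c)\) where the empirical counts drift away from the reference means.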
    relative entropy density
    almost everywhere convergence
    information source
    entropy density deviation
