The entropy of a mixture of probability distributions (Q925718)

scientific article; zbMATH DE number 5278265

    Statements

    The entropy of a mixture of probability distributions (English)
    Publication date: 22 May 2008
    Summary: If a message can take \(n\) different values and all values are equally probable, then the entropy of the message is \(\log(n)\). In the present paper, we investigate the expectation value of the entropy for an arbitrary probability distribution. For that purpose, we apply mixed probability distributions. The mixing distribution is represented by a point on an infinite-dimensional hypersphere in Hilbert space. During an `arbitrary' calculation, this mixing distribution tends to become uniform over a flat probability space of ever-decreasing dimensionality. Once such a smeared-out mixing distribution is established, subsequent computing steps introduce an entropy loss expected to equal \(\frac{1}{m+1}+\frac{1}{m+2}+\dots+\frac{1}{n}\), where \(n\) is the number of possible inputs and \(m\) the number of possible outcomes of the computation.
    probability distribution
    mixture distribution
    Bhattacharyya space
    Hilbert space

    Identifiers