A divergence formula for randomness and dimension (Q616503)
From MaRDI portal
scientific article; zbMATH DE number 5834261
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | A divergence formula for randomness and dimension | scientific article; zbMATH DE number 5834261 | |
Statements
A divergence formula for randomness and dimension (English)
10 January 2011
The reviewer considers the first part of the author's abstract as most illustrative of the content of the paper. ``If \(S\) is an infinite sequence over a finite alphabet \(\Sigma\) and \(\beta\) is a probability measure on \(\Sigma\), then the \textit{dimension} of \(S\) with respect to \(\beta\), written \(\mathrm{dim}^\beta(S)\), is a constructive version of the Billingsley dimension that coincides with the (constructive Hausdorff) dimension \(\mathrm{dim}(S)\) when \(\beta\) is the uniform probability measure. This paper shows that \(\mathrm{dim}^\beta(S)\) and its dual \(\mathrm{Dim}^\beta(S)\), the \textit{strong dimension} of \(S\) with respect to \(\beta\), can be used in conjunction with randomness to measure the similarity of two probability measures \(\alpha\) and \(\beta\) on \(\Sigma\). Specifically, we prove that the divergence formula \[ \mathrm{dim}^\beta(R)=\mathrm{Dim}^\beta(R)= \frac{\mathcal H (\alpha)}{\mathcal H (\alpha)+ \mathcal{D}(\alpha\,\|\,\beta)} \] holds whenever \(\alpha\) and \(\beta\) are computable, positive probability measures on \(\Sigma\) and \(R\in \Sigma^\infty\) is random with respect to \(\alpha\). In this formula, \(\mathcal H (\alpha)\) is the Shannon entropy of \(\alpha\), and \(\mathcal D(\alpha\, \|\, \beta)\) is the Kullback-Leibler divergence between \(\alpha\) and \(\beta\).''
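The right-hand side of the divergence formula can be checked numerically. The sketch below (helper names are hypothetical, not from the paper) computes \(\mathcal H(\alpha)/(\mathcal H(\alpha)+\mathcal D(\alpha\,\|\,\beta))\) for two positive probability measures on a binary alphabet; when \(\alpha=\beta\) the divergence vanishes and the formula gives dimension 1, as expected for \(\alpha\)-random sequences.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

def divergence_formula(alpha, beta):
    """Predicted dim^beta(R) = H(alpha) / (H(alpha) + D(alpha || beta))
    for R random with respect to alpha (both measures assumed positive)."""
    h = shannon_entropy(alpha)
    d = kl_divergence(alpha, beta)
    return h / (h + d)

# Example on the binary alphabet: alpha uniform, beta biased.
alpha = (0.5, 0.5)
beta = (0.25, 0.75)
print(divergence_formula(alpha, beta))  # strictly between 0 and 1
print(divergence_formula(alpha, alpha))  # 1.0: D(alpha || alpha) = 0
```

With \(\alpha\) uniform and \(\beta=(1/4,3/4)\), the computed value is roughly 0.828, illustrating how the dimension drops as \(\beta\) moves away from the measure \(R\) is random with respect to.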
constructive dimension
finite-state dimension
Kolmogorov complexity
Kullback-Leibler divergence
randomness
Shannon entropy