Entropy and the consistent estimation of joint distributions (Q1336572)
scientific article; zbMATH DE number 681389
Statements
Entropy and the consistent estimation of joint distributions (English)
13 February 1995
The empirical \(k\)-block distribution \(\widehat\mu_k(a_1^k)\) assigns to each block \(a_1^k=(a_1,\dots,a_k)\) the relative frequency with which it occurs as a string of consecutive symbols in a sample \((x_1,\dots,x_n)\) from an ergodic finite-alphabet process. A sequence \(k(n)\) \((\leq n)\) is said to be admissible for the corresponding ergodic measure \(\mu\) if \(\sum_{a_1^{k(n)}}\bigl|\widehat\mu_{k(n)}(a_1^{k(n)})-\mu_{k(n)}(a_1^{k(n)})\bigr|\to 0\) almost surely as \(n\to\infty\). It is proven that for an ergodic \(\mu\) with positive entropy \(H\), \(k(n)\) is not admissible if \(k(n)\geq \log n/(H-\varepsilon)\), whereas if the process is weak Bernoulli (a class that includes i.i.d. processes, \(\varphi\)-mixing processes, aperiodic Markov chains and functions thereof, and aperiodic renewal processes), then \(k(n)\) is admissible whenever \(k(n)\leq \log n/(H+\varepsilon)\).
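The two quantities in the review, the empirical \(k\)-block distribution and the \(\ell_1\) deviation whose almost-sure convergence defines admissibility, can be sketched in a few lines of Python. This is an illustrative sketch, not from the paper; the function names and the fair-coin example (an i.i.d. process with entropy \(H=\log 2\), so every \(k\)-block has probability \(2^{-k}\)) are assumptions made here.

```python
from collections import Counter
from itertools import product
import random

def empirical_block_dist(x, k):
    """Relative frequency of each k-block among the n-k+1
    consecutive length-k windows of the sample x."""
    total = len(x) - k + 1
    counts = Counter(tuple(x[i:i + k]) for i in range(total))
    return {block: c / total for block, c in counts.items()}

def l1_deviation(emp, true_prob):
    """Sum over k-blocks of |empirical - true|: the quantity whose
    a.s. convergence to 0 defines admissibility of k(n)."""
    blocks = set(emp) | set(true_prob)
    return sum(abs(emp.get(b, 0.0) - true_prob.get(b, 0.0)) for b in blocks)

# Illustration: i.i.d. fair-coin process; true k-block probabilities
# are uniform, 2^{-k}.  For k well below log2(n) the deviation is small.
random.seed(0)
k, n = 3, 10_000
x = [random.randint(0, 1) for _ in range(n)]
emp = empirical_block_dist(x, k)
true = {b: 2.0 ** -k for b in product((0, 1), repeat=k)}
print(l1_deviation(emp, true))
```

With \(n=10{,}000\) and \(k=3\) the printed deviation is close to zero, consistent with admissibility of small \(k(n)\); taking \(k\) near or above \(\log_2 n\) instead makes most blocks unseen and the deviation stays large.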
\(k\)-block distribution
ergodic finite alphabet process
admissible
ergodic measure
entropy
weak Bernoulli
\(\varphi\)-mixing processes