Entropic measures, Markov information sources and complexity
DOI: 10.1016/S0096-3003(01)00199-0 · zbMath: 1029.94008 · OpenAlex: W2047066710 · Wikidata: Q57001704 · Scholia: Q57001704 · MaRDI QID: Q1855845
Cristian S. Calude, Monica E. Dumitrescu
Publication date: 28 January 2003
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/s0096-3003(01)00199-0
Keywords: complexity; algorithmic probability; Shannon's entropy; entropy rate; Markov sources; entropic measures; Markov information sources
Markov chains (discrete-time Markov processes on discrete state spaces) (60J10); Algorithmic information theory (Kolmogorov complexity, etc.) (68Q30); Measures of information, entropy (94A17); Continuous-time Markov processes on discrete state spaces (60J27); Statistical aspects of information-theoretic topics (62B10)
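For context on the keywords "entropy rate" and "Markov sources": the Shannon entropy rate of a stationary Markov information source with transition matrix (p_ij) and stationary distribution π is H = -Σ_i π_i Σ_j p_ij log p_ij. The following is a minimal illustrative sketch (not taken from the paper) that computes this quantity for a made-up two-state source.

```python
import numpy as np

# Transition matrix of a two-state Markov source (illustrative values,
# not from the paper under review).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution pi solves pi P = pi with sum(pi) = 1;
# obtained here as the left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.isclose(eigvals, 1.0))])
pi = pi / pi.sum()

# Entropy rate H = -sum_i pi_i sum_j p_ij log2 p_ij, in bits per symbol.
logP = np.log2(P, where=P > 0, out=np.zeros_like(P))
H = -np.sum(pi[:, None] * P * logP)
print(f"entropy rate: {H:.4f} bits/symbol")
```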
Related Items (1)
Cites Work
- A Mathematical Theory of Communication
- Kolmogorov's contributions to information theory and algorithmic complexity
- Algorithmic information and simplicity in statistical physics
- A proof of the Beyer-Stein-Ulam relation between complexity and entropy
- Some informational properties of Markov pure-jump processes
- Finite Continuous Time Markov Chains
- RANDOMNESS AND COMPLEXITY IN PURE MATHEMATICS
- CODING WITH MINIMAL PROGRAMS