On joint conditional complexity (entropy)
From MaRDI portal
Publication: 2510760
DOI: 10.1134/S008154381106006X
zbMath: 1358.68151
OpenAlex: W2040214381
MaRDI QID: Q2510760
Andrej A. Muchnik, Nikolai K. Vereshchagin
Publication date: 4 August 2014
Published in: Proceedings of the Steklov Institute of Mathematics
Full work available at URL: https://doi.org/10.1134/s008154381106006x
- Algorithmic information theory (Kolmogorov complexity, etc.) (68Q30)
- Measures of information, entropy (94A17)
- Statistical aspects of information-theoretic topics (62B10)
Related Items (2)
- Using entropy for quantitative measurement of operational complexity of supplier-customer system: case studies
- Information disclosure in the framework of Kolmogorov complexity
Cites Work
- Shannon Entropy vs. Kolmogorov Complexity
- Information distance
- The Complexity of Finite Objects and the Development of the Concepts of Information and Randomness by Means of the Theory of Algorithms
- A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations
- Conditional complexity and codes
- Logical operations and Kolmogorov complexity