An operational characterization of mutual information in algorithmic information theory
From MaRDI portal
Publication:5002777
DOI: 10.4230/LIPIcs.ICALP.2018.95 · zbMath: 1499.68156 · OpenAlex: W2981786233 · MaRDI QID: Q5002777
Marius Zimand, Andrei Romashchenko
Publication date: 28 July 2021
Full work available at URL: https://doi.org/10.4230/lipics.icalp.2018.95
Cryptography (94A60); Algorithmic information theory (Kolmogorov complexity, etc.) (68Q30); Communication complexity, information complexity (68Q11)
Cites Work
- Stability of properties of Kolmogorov complexity under relativization
- Conditionally-perfect secrecy and a provably-secure randomized cipher
- On common information
- A note on Kolmogorov complexity and entropy
- Pairs of words with nonmaterializable mutual information
- Optimal Rate Code Constructions for Computationally Simple Channels
- Secrecy Capacities for Multiple Terminals
- Privacy Amplification by Public Discussion
- A Theory of Program Size Formally Identical to Information Theory
- The Wire-Tap Channel
- Broadcast channels with confidential messages
- Secret key agreement by public discussion from common information
- Common randomness in information theory and cryptography. I. Secret sharing
- Kolmogorov Complexity and Algorithmic Randomness
- Kolmogorov complexity version of Slepian-Wolf coding
- Conditional Information Inequalities for Entropic and Almost Entropic Points
- Common Information and Secret Key Capacity
- Upper semi-lattice of binary strings with the relation ``\(x\) is simple conditional to \(y\)''