Entropy and compression: a simple proof of an inequality of Khinchin-Ornstein-Shields
Publication: 2190980
DOI: 10.1134/S0032946020010020 · zbMath: 1457.94060 · arXiv: 1907.04713 · OpenAlex: W3099805774 · MaRDI QID: Q2190980
Filippo Mignosi, M. Spezialetti, Riccardo Aragona, Francesca Marzi
Publication date: 23 June 2020
Published in: Problems of Information Transmission
Full work available at URL: https://arxiv.org/abs/1907.04713
- Coding and information theory (compaction, compression, models of communication, encoding schemes, etc.) (aspects in computer science) (68P30)
- Measures of information, entropy (94A17)
- Coding theorems (Shannon theory) (94A24)
- Source coding (94A29)
Cites Work
- A Mathematical Theory of Communication
- Universal almost sure data compression
- The strong ergodic theorem for densities: Generalized Shannon-McMillan-Breiman theorem
- A sandwich proof of the Shannon-McMillan-Breiman theorem
- The Shannon-McMillan theorem for ergodic quantum lattice systems
- A simple proof of the Moy-Perez generalization of the Shannon-McMillan theorem
- The Individual Ergodic Theorem of Information Theory
- Correction Notes: Correction to "The Individual Ergodic Theorem of Information Theory"
- The source coding theorem revisited: A combinatorial approach
- Universal codeword sets and representations of the integers
- Second-order noiseless source coding theorems
- New bounds on the expected length of one-to-one codes
- Compression and entropy
- Optimal Lossless Data Compression: Non-Asymptotics and Asymptotics
- Elements of Information Theory
- A Note on the Ergodic Theorem of Information Theory
- Sample converses in source coding theory
- The Basic Theorems of Information Theory
- Information Theory