Bounds on Data Compression Ratio with a Given Tolerable Error Probability
From MaRDI portal
Publication: 4266369
DOI: 10.1017/S0269964800005143
zbMath: 0933.94015
OpenAlex: W2104430370
MaRDI QID: Q4266369
Publication date: 30 September 1999
Published in: Probability in the Engineering and Informational Sciences
Full work available at URL: https://doi.org/10.1017/s0269964800005143
Keywords: data compression; bounds; large deviations; error probability; Shannon's theory; compression rate; Cramér's functions
MSC classifications: Large deviations (60F10); Error probability in coding theory (94B70); Coding theorems (Shannon theory) (94A24); Source coding (94A29); Rate-distortion theory in information and communication theory (94A34)
Cites Work
- A Mathematical Theory of Communication
- Universal almost sure data compression
- The strong ergodic theorem for densities: Generalized Shannon-McMillan-Breiman theorem
- A sandwich proof of the Shannon-McMillan-Breiman theorem
- The Individual Ergodic Theorem of Information Theory
- An Asymptotic Theory of Large Deviations for Markov Jump Processes
- The source coding theorem revisited: A combinatorial approach
- The error exponent for the noiseless encoding of finite ergodic Markov sources
- Error exponent for source coding with a fidelity criterion
- Sliding-block source coding
- Process definitions of distortion-rate functions and source coding theorems
- Reliability function of a discrete memoryless channel at rates above capacity (Corresp.)
- Coding of sources with unknown statistics--I: Probability of encoding error
- Computation of channel capacity and rate-distortion functions
- On the converse to the coding theorem for discrete memoryless channels (Corresp.)
- A coding theorem for discrete-time sources
- Handbook of stochastic methods for physics, chemistry and natural sciences.