An Introduction to Neural Data Compression
DOI: 10.1561/0600000107
zbMATH: 1529.68097
arXiv: 2202.06533
OpenAlex: W4367018374
MaRDI QID: Q6180224
Yibo Yang, Stephan Mandt, Lucas Theis
Publication date: 19 December 2023
Published in: Foundations and Trends® in Computer Graphics and Vision
Full work available at URL: https://arxiv.org/abs/2202.06533
Keywords: data compression; information theory; variational inference; source coding; deep learning; rate-distortion theory; speech/audio/image/video compression
MSC classification: Artificial neural networks and deep learning (68T07); Coding and information theory (compaction, compression, models of communication, encoding schemes, etc.; aspects in computer science) (68P30); Machine vision and scene understanding (68T45); Source coding (94A29); Rate-distortion theory in information and communication theory (94A34)
Cites Work
- A Mathematical Theory of Communication
- On surrogate loss functions and \(f\)-divergences
- Auto-association by multilayer perceptrons and singular value decomposition
- Simple statistical gradient-following algorithms for connectionist reinforcement learning
- The Likelihood Encoder for Lossy Compression
- Lattice Coding for Signals and Networks
- On universal quantization
- On universal quantization by randomized uniform/lattice quantizers
- A universal algorithm for sequential data compression
- Arithmetic Coding
- Transform coding with backward adaptive updates
- A Feature-Enriched Completely Blind Image Quality Evaluator
- Entanglement-assisted capacity of a quantum channel and the reverse Shannon theorem
- Quantization
- Lossy source coding
- Least squares quantization in PCM
- Efficient scalar quantization of exponential and Laplacian random variables
- Discrete Cosine Transform
- Elements of Information Theory
- A Method for the Construction of Minimum-Redundancy Codes
- An algorithm for computing the capacity of arbitrary discrete memoryless channels
- Computation of channel capacity and rate-distortion functions