Beyond transmitting bits: Context, semantics, and task-oriented communications
Communication systems to date primarily aim at reliably communicating bit sequences.
Such an approach provides efficient engineering designs that are agnostic to the meanings …
An introduction to neural data compression
Neural compression is the application of neural networks and other machine learning
methods to data compression. Recent advances in statistical machine learning have opened …
Lossy image compression with conditional diffusion models
This paper outlines an end-to-end optimized lossy image compression framework using
diffusion generative models. The approach relies on the transform coding paradigm, where …
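As a rough illustration of the transform coding paradigm the abstract refers to, the sketch below (with hypothetical `analysis` and `synthesis` modules, not the paper's architecture) shows the basic encode-quantize-decode pipeline; in the diffusion variant, the synthesis step would be sampling from a conditional diffusion model given the quantized latent.

```python
import torch

def transform_code(x, analysis, synthesis):
    """Bare-bones transform coding sketch (hypothetical `analysis` and
    `synthesis` networks): map the input to a latent, quantize it, and
    reconstruct from the quantized latent."""
    y = analysis(x)          # learned analysis transform
    y_hat = torch.round(y)   # scalar quantization; entropy-coded in practice
    return synthesis(y_hat)  # learned synthesis / generative decoder
```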
A review of change of variable formulas for generative modeling
U Köthe - arXiv preprint arXiv:2308.02652, 2023 - arxiv.org
Change-of-variables (CoV) formulas allow one to reduce complicated probability densities to simpler ones by a learned transformation with a tractable Jacobian determinant. They are thus …
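For reference, the standard change-of-variables identity behind this line of work: if $f$ is an invertible, differentiable map from data $x$ to a latent $z = f(x)$ with a simple density $p_Z$, then

$$ p_X(x) = p_Z\bigl(f(x)\bigr)\,\bigl|\det J_f(x)\bigr|, $$

where $J_f(x)$ is the Jacobian of $f$ at $x$; maximum-likelihood training is tractable exactly when this determinant is.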
Faster relative entropy coding with greedy rejection coding
Relative entropy coding (REC) algorithms encode a sample from a target distribution $Q$ with a proposal distribution $P$, using as few bits as possible. Unlike entropy coding, REC …
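To make the setup concrete, here is a toy REC scheme in the "minimal random coding" style (importance-sampling selection over candidates generated from shared randomness). This illustrates the general REC idea only, not the paper's greedy rejection coding algorithm, and all names are invented for the sketch.

```python
import numpy as np
from scipy.stats import norm

def rec_encode(q_mean, q_std, p_mean, p_std, n_samples, seed=0):
    """Toy REC encoder: draw candidates from the proposal P using a shared
    seed, then transmit the index of one candidate chosen with probability
    proportional to the importance weight Q(x)/P(x)."""
    rng = np.random.default_rng(seed)           # shared randomness
    xs = rng.normal(p_mean, p_std, n_samples)   # candidates from P
    log_w = norm.logpdf(xs, q_mean, q_std) - norm.logpdf(xs, p_mean, p_std)
    w = np.exp(log_w - log_w.max())
    idx = rng.choice(n_samples, p=w / w.sum())  # uniform code: ~log2(n) bits
    return idx

def rec_decode(idx, p_mean, p_std, n_samples, seed=0):
    """Decoder regenerates the same candidates from the shared seed."""
    rng = np.random.default_rng(seed)
    xs = rng.normal(p_mean, p_std, n_samples)
    return xs[idx]
```

Heuristically, on the order of $2^{\mathrm{KL}[Q\|P]}$ candidates are needed for the selected sample to be approximately distributed like $Q$; the algorithms in this literature achieve the target codelength with better index coding and faster selection.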
On the rate-distortion-perception function
Rate-distortion-perception theory extends Shannon's rate-distortion theory by introducing a
constraint on the perceptual quality of the output. The perception constraint complements the …
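The central object is the rate-distortion-perception function, which adds a divergence constraint between the source and reconstruction distributions to Shannon's formulation:

$$ R(D, P) = \min_{p_{\hat{X}\mid X}} I(X; \hat{X}) \quad \text{s.t.} \quad \mathbb{E}\,\Delta(X, \hat{X}) \le D, \quad d\bigl(p_X, p_{\hat{X}}\bigr) \le P, $$

where $\Delta$ is a distortion measure and $d$ is a divergence between distributions (the perception constraint).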
Universally quantized neural compression
E Agustsson, L Theis - Advances in Neural Information Processing Systems, 2020 - proceedings.neurips.cc
A popular approach to learning encoders for lossy compression is to use additive uniform
noise during training as a differentiable approximation to test-time quantization. We …
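A minimal sketch of the train/test setup the abstract describes, assuming a PyTorch setting: uniform noise stands in for rounding during training so gradients flow, while test time uses hard rounding. Universal (dithered) quantization, which the title refers to, instead shares a dither between sender and receiver so the test-time channel matches the train-time one.

```python
import torch

def quantize(y, training, u=None):
    """Additive-noise proxy for quantization. During training, add
    U(-0.5, 0.5) noise as a differentiable surrogate for rounding. At
    test time, either round hard (the usual train/test mismatch) or,
    given a shared dither u ~ U(-0.5, 0.5), apply subtract-dither
    rounding, whose output is distributed like the train-time proxy."""
    if training:
        return y + torch.rand_like(y) - 0.5
    if u is None:
        return torch.round(y)      # hard quantization
    return torch.round(y + u) - u  # universal (dithered) quantization
```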
Optimal compression of locally differentially private mechanisms
Compressing the output of $\epsilon$-locally differentially private (LDP) randomizers
naively leads to suboptimal utility. In this work, we demonstrate the benefits of using …
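For context, the privacy constraint in question: a randomizer $M$ is $\epsilon$-locally differentially private if its output distribution changes by at most a factor $e^{\epsilon}$ when its input changes,

$$ \Pr[M(x) \in S] \le e^{\epsilon} \, \Pr[M(x') \in S] \quad \text{for all inputs } x, x' \text{ and output sets } S. $$

The challenge is to reduce the communication cost of $M$'s output without sacrificing this guarantee or estimation utility.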
Fast relative entropy coding with A* coding
Relative entropy coding (REC) algorithms encode a sample from a target distribution $Q$ using a proposal distribution $P$, such that the expected codelength is $O(\mathrm{KL}[Q \,\|\, P])$. REC can …
Algorithms for the communication of samples
The efficient communication of noisy data has applications in several areas of machine
learning, such as neural compression or differential privacy, and is also known as reverse …