Beyond transmitting bits: Context, semantics, and task-oriented communications

D Gündüz, Z Qin, IE Aguerri, HS Dhillon… - IEEE Journal on …, 2022 - ieeexplore.ieee.org
Communication systems to date primarily aim at reliably communicating bit sequences.
Such an approach provides efficient engineering designs that are agnostic to the meanings …

An introduction to neural data compression

Y Yang, S Mandt, L Theis - Foundations and Trends® in …, 2023 - nowpublishers.com
Neural compression is the application of neural networks and other machine learning
methods to data compression. Recent advances in statistical machine learning have opened …

Lossy image compression with conditional diffusion models

R Yang, S Mandt - Advances in Neural Information …, 2024 - proceedings.neurips.cc
This paper outlines an end-to-end optimized lossy image compression framework using
diffusion generative models. The approach relies on the transform coding paradigm, where …
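
For readers unfamiliar with the paradigm, here is a minimal sketch of transform coding in PyTorch. The module names and shapes are illustrative assumptions, not the paper's architecture; the paper's key move is to replace the deterministic synthesis transform below with a diffusion decoder conditioned on the quantized latent.

```python
import torch
import torch.nn as nn

class TransformCodingAutoencoder(nn.Module):
    """Analysis transform -> quantization -> synthesis transform."""

    def __init__(self, dim=64):
        super().__init__()
        # Analysis transform: maps the image to a compact latent.
        self.analysis = nn.Sequential(
            nn.Conv2d(3, dim, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(dim, dim, 5, stride=2, padding=2),
        )
        # Synthesis transform: reconstructs the image from the latent.
        self.synthesis = nn.Sequential(
            nn.ConvTranspose2d(dim, dim, 5, stride=2, padding=2,
                               output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(dim, 3, 5, stride=2, padding=2,
                               output_padding=1),
        )

    def forward(self, x):
        y = self.analysis(x)
        # Straight-through rounding: hard quantization in the forward
        # pass, identity gradient in the backward pass.
        y_hat = y + (torch.round(y) - y).detach()
        return self.synthesis(y_hat)

x = torch.rand(1, 3, 64, 64)
print(TransformCodingAutoencoder()(x).shape)  # torch.Size([1, 3, 64, 64])
```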

A review of change of variable formulas for generative modeling

U Köthe - arXiv preprint arXiv:2308.02652, 2023 - arxiv.org
Change-of-variables (CoV) formulas allow one to reduce complicated probability densities to
simpler ones via a learned transformation with a tractable Jacobian determinant. They are thus …
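
For reference, the identity the snippet refers to: if an invertible, differentiable $f$ maps data $x$ to a latent $z = f(x)$ with simple density $p_Z$, then

```latex
p_X(x) \;=\; p_Z\big(f(x)\big)\,\left|\det \frac{\partial f(x)}{\partial x}\right|.
```

Normalizing flows parameterize $f$ (e.g., with coupling layers) precisely so that this Jacobian determinant is cheap to evaluate.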

Faster relative entropy coding with greedy rejection coding

G Flamich, S Markou… - Advances in Neural …, 2024 - proceedings.neurips.cc
Relative entropy coding (REC) algorithms encode a sample from a target distribution $Q$
with a proposal distribution $P$ using as few bits as possible. Unlike entropy coding, REC …
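
A minimal shared-randomness sketch of the REC idea, using plain rejection sampling rather than the paper's greedy rejection coding (which achieves better codelengths); the Gaussian target/proposal pair and the bound `M` are illustrative assumptions. Encoder and decoder share a random seed, so only the accepted candidate's index needs to be transmitted.

```python
import numpy as np
from scipy.stats import norm

def encode(seed, q, p, M, max_iter=10_000):
    """Return the index of the first shared candidate accepted under Q."""
    rng = np.random.default_rng(seed)
    for k in range(max_iter):
        x = p.rvs(random_state=rng)        # shared candidate X_k ~ P
        u = rng.uniform()
        if u < q.pdf(x) / (M * p.pdf(x)):  # accept w.p. q(x) / (M p(x))
            return k                       # only this integer is sent
    raise RuntimeError("no candidate accepted")

def decode(seed, p, k):
    """Rebuild candidate X_k from the shared seed (no access to Q needed)."""
    rng = np.random.default_rng(seed)
    for _ in range(k):                     # replay the encoder's draws
        p.rvs(random_state=rng)
        rng.uniform()
    return p.rvs(random_state=rng)

q = norm(loc=1.0, scale=0.5)  # target Q, known only to the encoder
p = norm(loc=0.0, scale=1.0)  # proposal P, shared with the decoder
M = 4.0                       # assumed bound on sup_x q(x)/p(x) for this pair
k = encode(seed=42, q=q, p=p, M=M)
print(k, decode(seed=42, p=p, k=k))       # decoder recovers a sample from Q
```

Entropy coding the index costs roughly $\log M$ bits with this naive scheme; greedy rejection coding instead approaches a codelength scaling with $\mathrm{KL}[Q \,\|\, P]$.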

On the rate-distortion-perception function

J Chen, L Yu, J Wang, W Shi, Y Ge… - IEEE Journal on …, 2022 - ieeexplore.ieee.org
Rate-distortion-perception theory extends Shannon's rate-distortion theory by introducing a
constraint on the perceptual quality of the output. The perception constraint complements the …
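
In the notation common to this line of work (following Blau and Michaeli), the rate-distortion-perception function adds a divergence constraint between source and reconstruction distributions to the classical problem:

```latex
R(D, P) \;=\; \min_{p_{\hat{X} \mid X}} \; I(X; \hat{X})
\quad \text{s.t.} \quad
\mathbb{E}\big[\Delta(X, \hat{X})\big] \le D,
\qquad
d\big(p_X, p_{\hat{X}}\big) \le P,
```

where $\Delta$ is a distortion measure and $d$ is a divergence; exact notation varies between papers.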

Universally quantized neural compression

E Agustsson, L Theis - Advances in neural information …, 2020 - proceedings.neurips.cc
A popular approach to learning encoders for lossy compression is to use additive uniform
noise during training as a differentiable approximation to test-time quantization. We …
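
The trick the abstract describes fits in a few lines; this sketch (illustrative names, PyTorch) shows the train/test mismatch it is meant to bridge. The paper's universal quantization goes further by sharing a uniform dither between encoder and decoder so that training and test-time behavior match exactly.

```python
import torch

def soft_quantize(y: torch.Tensor, training: bool) -> torch.Tensor:
    if training:
        # Additive U(-0.5, 0.5) noise: a differentiable proxy for rounding.
        return y + torch.rand_like(y) - 0.5
    # Hard quantization at test time.
    return torch.round(y)

y = torch.randn(4, requires_grad=True)
soft_quantize(y, training=True).sum().backward()
print(y.grad)                                     # gradients flow: all ones
print(soft_quantize(y.detach(), training=False))  # rounded latents
```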

Optimal compression of locally differentially private mechanisms

A Shah, WN Chen, J Ballé… - International …, 2022 - proceedings.mlr.press
Compressing the output of $\epsilon$-locally differentially private (LDP) randomizers
naively leads to suboptimal utility. In this work, we demonstrate the benefits of using …
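
For context, a randomizer $M$ is $\epsilon$-LDP if no single output distinguishes any two inputs by more than a factor $e^{\epsilon}$:

```latex
\Pr[M(x) = y] \;\le\; e^{\epsilon} \, \Pr[M(x') = y]
\quad \text{for all inputs } x, x' \text{ and all outputs } y.
```

The compression problem is then to reduce the bits needed to describe $M(x)$ while preserving both this guarantee and the mechanism's utility.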

Fast relative entropy coding with A* coding

G Flamich, S Markou… - … on Machine Learning, 2022 - proceedings.mlr.press
Relative entropy coding (REC) algorithms encode a sample from a target distribution $Q$
using a proposal distribution $P$, such that the expected codelength is $O(\mathrm{KL}[Q \,\|\, P])$. REC can …
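
The $O(\mathrm{KL}[Q \,\|\, P])$ scaling is usually stated more precisely; a typical one-shot bound from the channel-simulation literature has the form (constants vary by scheme, so take this as indicative)

```latex
\mathbb{E}[\ell] \;\le\; \mathrm{KL}[Q \,\|\, P] + \log\big(\mathrm{KL}[Q \,\|\, P] + 1\big) + O(1).
```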

Algorithms for the communication of samples

L Theis, NY Ahmed - International Conference on Machine …, 2022 - proceedings.mlr.press
The efficient communication of noisy data has applications in several areas of machine
learning, such as neural compression or differential privacy, and is also known as reverse …
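
In the reverse channel coding setup the snippet alludes to, the encoder observes $x$ and must convey a single sample $y \sim p(y \mid x)$ to the decoder using shared randomness. The relevant per-instance quantity is $\mathrm{KL}[p(y \mid x) \,\|\, p(y)]$, whose average over the source is the mutual information:

```latex
I(X; Y) \;=\; \mathbb{E}_{x \sim p(x)} \, \mathrm{KL}\big[\, p(y \mid x) \,\big\|\, p(y) \,\big],
```

so over many instances the achievable rate is governed by $I(X; Y)$ rather than by the entropy of $Y$.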