Task-oriented communication for multidevice cooperative edge inference

J Shao, Y Mao, J Zhang - IEEE Transactions on Wireless …, 2022 - ieeexplore.ieee.org
This paper investigates task-oriented communication for multi-device cooperative edge
inference, where a group of distributed low-end edge devices transmit the extracted features …
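The excerpt cuts off before the method. As orientation only (an assumption on my part, not stated in this snippet), task-oriented feature encoding of this kind is commonly formalized as an information bottleneck tradeoff between communication cost and inference relevance:

\[ \min_{P_{Z \mid X}} \; I(X;Z) \;-\; \beta\, I(Y;Z), \]

where X is a device observation, Z the transmitted feature, Y the inference target, and β > 0 sets the tradeoff.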

An operational approach to information leakage

I Issa, AB Wagner, S Kamath - IEEE Transactions on …, 2019 - ieeexplore.ieee.org
Given two random variables X and Y, an operational approach is undertaken to quantify the
“leakage” of information from X to Y. The resulting measure L(X → Y) is called maximal …
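For finite alphabets, maximal leakage has a simple closed form (a standard result of this line of work, restated here since the excerpt truncates):

\[ \mathcal{L}(X \to Y) \;=\; \log \sum_{y} \max_{x:\, P_X(x) > 0} P_{Y \mid X}(y \mid x). \]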

The conditional entropy bottleneck

I Fischer - Entropy, 2020 - mdpi.com
Much of the field of Machine Learning exhibits a prominent set of failure modes, including
vulnerability to adversarial examples, poor out-of-distribution (OoD) detection …
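As I understand Fischer's proposal (not stated in this excerpt, and up to the placement of the tradeoff parameter), the conditional entropy bottleneck penalizes information about the input X that is not predictive of the target Y:

\[ \min_{P_{Z \mid X}} \; \beta\, I(X;Z \mid Y) \;-\; I(Y;Z). \]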

Strong Data Processing Inequalities and Φ-Sobolev Inequalities for Discrete Channels

M Raginsky - IEEE Transactions on Information Theory, 2016 - ieeexplore.ieee.org
The noisiness of a channel can be measured by comparing suitable functionals of the input
and output distributions. For instance, the worst case ratio of output relative entropy to input …
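The “worst case ratio” the abstract begins to describe is the channel's KL contraction coefficient,

\[ \eta_{\mathrm{KL}}(K) \;=\; \sup_{P \neq Q} \frac{D(PK \,\|\, QK)}{D(P \,\|\, Q)} \;\le\; 1, \]

which yields the strong data processing inequality \( D(PK \,\|\, QK) \le \eta_{\mathrm{KL}}(K)\, D(P \,\|\, Q) \).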

An operational measure of information leakage

I Issa, S Kamath, AB Wagner - 2016 Annual Conference on …, 2016 - ieeexplore.ieee.org
Given two discrete random variables X and Y, an operational approach is undertaken to
quantify the “leakage” of information from X to Y. The resulting measure ℒ(X → Y) is called …
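The operational definition behind the measure (standard in this line of work, supplied since the excerpt truncates) compares guessing an arbitrary secret U from Y against blind guessing:

\[ \mathcal{L}(X \to Y) \;=\; \sup_{U:\, U - X - Y} \log \frac{\max_{\hat{u}(\cdot)} \Pr\big[U = \hat{u}(Y)\big]}{\max_{u} P_U(u)}. \]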

Common information, noise stability, and their extensions

L Yu, VYF Tan - Foundations and Trends® in …, 2022 - nowpublishers.com
Common information is ubiquitous in information theory and related areas such as
theoretical computer science and discrete probability. However, because there are multiple …
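Two of the “multiple” notions the abstract alludes to are Wyner's and the Gács–Körner common information (standard definitions, not taken from the excerpt):

\[ C_{\mathrm{W}}(X;Y) \;=\; \min_{W:\, X - W - Y} I(X,Y;W), \qquad C_{\mathrm{GK}}(X;Y) \;=\; \max_{f,g:\, f(X) = g(Y)\ \text{a.s.}} H(f(X)). \]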

On universal features for high-dimensional learning and inference

SL Huang, A Makur, GW Wornell, L Zheng - arXiv preprint arXiv …, 2019 - arxiv.org
We consider the problem of identifying universal low-dimensional features from high-
dimensional data for inference tasks in settings involving learning. For such problems, we …
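A central object in this line of work (my gloss, not the excerpt's) is the Hirschfeld–Gebelein–Rényi maximal correlation, whose optimizing function pairs play the role of universal low-dimensional features:

\[ \rho(X;Y) \;=\; \sup_{\substack{f,\, g:\ \mathbb{E} f(X) = \mathbb{E} g(Y) = 0 \\ \mathbb{E} f^2(X) = \mathbb{E} g^2(Y) = 1}} \mathbb{E}\big[f(X)\, g(Y)\big]. \]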

An information theoretic interpretation to deep neural networks

X Xu, SL Huang, L Zheng, GW Wornell - Entropy, 2022 - mdpi.com
With the unprecedented performance achieved by deep learning, it is commonly believed
that deep neural networks (DNNs) attempt to extract informative features for learning tasks …

Local Differential Privacy Is Equivalent to Contraction of an Eγ-Divergence

S Asoodeh, M Aliakbarpour… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org
We investigate the local differential privacy (LDP) guarantees of a randomized privacy
mechanism via its contraction properties. We first show that LDP constraints can be …
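The divergence in question is the hockey-stick divergence \( E_\gamma(P \,\|\, Q) = \sup_A \big[ P(A) - \gamma\, Q(A) \big] \); as I read the result, a mechanism K is ε-LDP exactly when

\[ E_{e^{\varepsilon}}\big(K(\cdot \mid x) \,\|\, K(\cdot \mid x')\big) \;=\; 0 \quad \text{for all input pairs } x, x'. \]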

Information contraction and decomposition

A Makur - 2019 - dspace.mit.edu
These inequalities can be tightened to produce “strong” data processing inequalities
(SDPIs), which are obtained by introducing appropriate channel-dependent or source …
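A prototypical SDPI of the kind described, in its general f-divergence form (standard statement, supplied for context): for a fixed channel W,

\[ D_f(PW \,\|\, QW) \;\le\; \eta_f(W)\, D_f(P \,\|\, Q), \qquad \eta_f(W) \in [0,1], \]

which tightens the ordinary data processing inequality whenever the contraction coefficient \( \eta_f(W) < 1 \).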