Task-oriented communication for multidevice cooperative edge inference
This paper investigates task-oriented communication for multi-device cooperative edge
inference, where a group of distributed low-end edge devices transmit the extracted features …
An operational approach to information leakage
Given two random variables X and Y, an operational approach is undertaken to quantify the
“leakage” of information from X to Y. The resulting measure ℒ (X→ Y) is called maximal …
The conditional entropy bottleneck
I Fischer - Entropy, 2020 - mdpi.com
Much of the field of Machine Learning exhibits a prominent set of failure modes, including
vulnerability to adversarial examples, poor out-of-distribution (OoD) detection …
Strong Data Processing Inequalities and Φ-Sobolev Inequalities for Discrete Channels
M Raginsky - IEEE Transactions on Information Theory, 2016 - ieeexplore.ieee.org
The noisiness of a channel can be measured by comparing suitable functionals of the input
and output distributions. For instance, the worst case ratio of output relative entropy to input …
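The "worst case ratio of output relative entropy to input relative entropy" in this entry is the KL contraction coefficient η_KL of a channel. As a hedged illustration (not code from the paper), the sketch below estimates η_KL numerically for a binary symmetric channel BSC(δ), for which the closed form (1 − 2δ)² is known; the function and variable names are my own.

```python
import math
import random

def kl(p, q):
    """KL divergence D(p||q) for Bernoulli distributions given by P(X=1)."""
    def term(a, b):
        return 0.0 if a == 0 else a * math.log(a / b)
    return term(p, q) + term(1 - p, 1 - q)

def bsc_output(p, delta):
    """P(Y=1) when X ~ Bernoulli(p) passes through a BSC with crossover delta."""
    return p * (1 - delta) + (1 - p) * delta

def estimate_eta_kl(delta, trials=20000, seed=0):
    """Estimate the KL contraction coefficient of BSC(delta) by random search
    over pairs of input distributions: sup D(PK||QK) / D(P||Q)."""
    rng = random.Random(seed)
    best = 0.0
    for _ in range(trials):
        p, q = rng.uniform(0.01, 0.99), rng.uniform(0.01, 0.99)
        d_in = kl(p, q)
        if d_in < 1e-9:
            continue  # skip near-identical pairs to avoid numerical noise
        ratio = kl(bsc_output(p, delta), bsc_output(q, delta)) / d_in
        best = max(best, ratio)
    return best

delta = 0.2
est = estimate_eta_kl(delta)
bound = (1 - 2 * delta) ** 2  # known closed form for the BSC: 0.36
print(est, bound)
```

Every sampled ratio stays below the bound, as the data processing inequality requires, and the best ratio approaches it, which is what makes the inequality "strong": the contraction factor is strictly less than 1 for a noisy channel.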
An operational measure of information leakage
Given two discrete random variables X and Y, an operational approach is undertaken to
quantify the “leakage” of information from X to Y. The resulting measure ℒ (X→ Y) is called …
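For finite alphabets, the maximal leakage described in this entry admits a simple closed form: ℒ(X→Y) = log Σ_y max_x P(Y=y|X=x), independent of the input distribution over its support. As a hedged sketch (the function name and channel representation are my own, not from the paper):

```python
import math

def maximal_leakage(channel):
    """Maximal leakage L(X -> Y) in nats for a discrete channel given as a
    list of rows, channel[x][y] = P(Y=y | X=x).

    Computes log( sum_y max_x P(Y=y | X=x) ), which depends only on the
    channel, not on the input distribution."""
    num_outputs = len(channel[0])
    total = sum(max(row[y] for row in channel) for y in range(num_outputs))
    return math.log(total)

# A binary symmetric channel with crossover probability 0.1:
bsc = [[0.9, 0.1], [0.1, 0.9]]
print(maximal_leakage(bsc))  # log(0.9 + 0.9) = log 1.8, about 0.588 nats
```

A noiseless binary channel gives log 2, the full entropy of a uniform bit, while a useless channel with identical rows gives log 1 = 0, i.e., no leakage.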
Common information, noise stability, and their extensions
Common information is ubiquitous in information theory and related areas such as
theoretical computer science and discrete probability. However, because there are multiple …
On universal features for high-dimensional learning and inference
SL Huang, A Makur, GW Wornell, L Zheng - arXiv preprint arXiv …, 2019 - arxiv.org
We consider the problem of identifying universal low-dimensional features from high-
dimensional data for inference tasks in settings involving learning. For such problems, we …
An information theoretic interpretation to deep neural networks
With the unprecedented performance achieved by deep learning, it is commonly believed
that deep neural networks (DNNs) attempt to extract informative features for learning tasks …
Local Differential Privacy Is Equivalent to Contraction of an f-Divergence
S Asoodeh, M Aliakbarpour… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org
We investigate the local differential privacy (LDP) guarantees of a randomized privacy
mechanism via its contraction properties. We first show that LDP constraints can be …
Information contraction and decomposition
A Makur - 2019 - dspace.mit.edu
These inequalities can be tightened to produce "strong" data processing inequalities
(SDPIs), which are obtained by introducing appropriate channel-dependent or source …