Common information, noise stability, and their extensions
Common information is ubiquitous in information theory and related areas such as
theoretical computer science and discrete probability. However, because there are multiple …
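For orientation (standard definitions added for context, not quoted from this entry), the two classical notions of common information and the usual notion of Boolean noise stability are

\[
C_{\mathrm{W}}(X;Y) = \min_{P_{W\mid XY}:\, X - W - Y} I(X,Y;W), \qquad
C_{\mathrm{GK}}(X;Y) = \max\{ H(W) : W = f(X) = g(Y) \text{ a.s.} \},
\]
\[
\mathrm{Stab}_\rho[f] = \mathbb{E}\big[f(x)\,f(y)\big], \quad f:\{-1,1\}^n \to \{-1,1\}, \ y \text{ a } \rho\text{-correlated copy of } x,
\]

due to Wyner (1975), Gács and Körner (1973), and the Boolean-function analysis literature, respectively.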
On universal features for high-dimensional learning and inference
SL Huang, A Makur, GW Wornell, L Zheng - arXiv preprint arXiv …, 2019 - arxiv.org
We consider the problem of identifying universal low-dimensional features from high-
dimensional data for inference tasks in settings involving learning. For such problems, we …
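The low-dimensional features in this line of work are closely tied to the modal (singular value) decomposition of the joint distribution. As a hedged illustration of that connection, and not the authors' algorithm, the sketch below computes the Hirschfeld-Gebelein-Rényi maximal correlation of a finite-alphabet pair from the second singular value of the canonical dependence matrix (Witsenhausen's classical observation); the 3x3 joint pmf is invented for the example.

import numpy as np

# Hypothetical joint pmf of (X, Y) on a 3x3 alphabet, chosen only for illustration.
P = np.array([[0.20, 0.05, 0.05],
              [0.05, 0.20, 0.05],
              [0.05, 0.05, 0.30]])
assert np.isclose(P.sum(), 1.0)

px = P.sum(axis=1)  # marginal of X
py = P.sum(axis=0)  # marginal of Y

# Canonical dependence matrix Q[x, y] = P(x, y) / sqrt(P_X(x) * P_Y(y)).
Q = P / np.sqrt(np.outer(px, py))

# Its largest singular value is always 1; the second one equals the
# HGR maximal correlation of X and Y.
sigma = np.linalg.svd(Q, compute_uv=False)
print("singular values:", np.round(sigma, 4))
print("HGR maximal correlation:", round(float(sigma[1]), 4))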
Converses for secret key agreement and secure computing
H Tyagi, S Watanabe - IEEE Transactions on Information …, 2015 - ieeexplore.ieee.org
We consider information theoretic secret key (SK) agreement and secure function
computation by multiple parties observing correlated data, with access to an interactive …
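A standard benchmark for such converses (a classical result of Csiszár and Narayan, not this paper's contribution): with m terminals observing correlated sources X_1, ..., X_m and unlimited public discussion, and the eavesdropper observing only that discussion, the secret key capacity is

\[
C_{\mathrm{SK}} = H(X_1,\dots,X_m) - R_{\mathrm{CO}},
\]

where R_CO is the minimum total communication rate for omniscience, i.e., for every terminal to recover all m sources.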
Notes on information-theoretic privacy
We investigate the tradeoff between privacy and utility in a situation where both privacy and
utility are measured in terms of mutual information. For the binary case, we fully characterize …
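One common formulation consistent with this snippet (hedged: the paper's exact setup and notation may differ) takes private data S correlated with observable data X and optimizes over randomized disclosures Z obeying the Markov chain S - X - Z:

\[
\max_{P_{Z\mid X}} \; I(X;Z) \quad \text{subject to} \quad I(S;Z) \le \epsilon,
\]

so that utility and privacy leakage are both measured in bits of mutual information.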
Common information and secret key capacity
H Tyagi - IEEE Transactions on Information Theory, 2013 - ieeexplore.ieee.org
We study the generation of a secret key of maximum rate by a pair of terminals observing
correlated sources and with the means to communicate over a noiseless public …
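The classical benchmark behind this setting (Maurer 1993; Ahlswede and Csiszár 1993) is that, for two terminals observing X and Y with unlimited noiseless public discussion and no eavesdropper side information, the secret key capacity is

\[
C_{\mathrm{SK}}(X;Y) = I(X;Y),
\]

and the snippet indicates the focus here is on generating a key of this maximum rate.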
A lossy source coding interpretation of Wyner's common information
G Xu, W Liu, B Chen - IEEE Transactions on Information Theory, 2015 - ieeexplore.ieee.org
Wyner's common information was originally defined for a pair of dependent discrete random
variables. Its significance is largely reflected in, and also confined to, several existing …
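For reference, Wyner's original operational characterization (a standard fact; its lossy extension is this paper's subject): in the lossless Gray-Wyner network, where a common message of rate R_0 reaches both decoders and private messages of rates R_1 and R_2 reach one decoder each,

\[
C_{\mathrm{W}}(X;Y) = \min\{ R_0 : (R_0,R_1,R_2) \text{ achievable}, \; R_0 + R_1 + R_2 = H(X,Y) \}.
\]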
Gray–Wyner and mutual information regions for doubly symmetric binary sources and Gaussian sources
L Yu - IEEE Transactions on Information Theory, 2023 - ieeexplore.ieee.org
Nonconvex optimization plays a key role in multi-user information theory and related fields,
but it is usually difficult to solve. The rate region of the Gray–Wyner source coding system (or …
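For reference (standard single-letter facts rather than this paper's contributions): the lossless Gray-Wyner rate region is the closure of all triples satisfying

\[
R_0 \ge I(X,Y;W), \qquad R_1 \ge H(X\mid W), \qquad R_2 \ge H(Y\mid W)
\]

for some auxiliary W jointly distributed with (X,Y), and a doubly symmetric binary source with parameter p is the pair X ~ Bernoulli(1/2), Y = X ⊕ N with N ~ Bernoulli(p) independent of X.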
The lossy common information of correlated sources
KB Viswanatha, E Akyol, K Rose - IEEE Transactions on …, 2014 - ieeexplore.ieee.org
The two most prevalent notions of common information (CI) are due to Wyner and Gács-Körner, and both notions can be stated as two different characteristic points in the …
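The relationship alluded to in the snippet is often summarized by the standard ordering

\[
C_{\mathrm{GK}}(X;Y) \le I(X;Y) \le C_{\mathrm{W}}(X;Y),
\]

with all three quantities coinciding when X and Y are conditionally independent given their Gács-Körner common part.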
Assisted common information with an application to secure two-party sampling
VM Prabhakaran… - IEEE Transactions on …, 2014 - ieeexplore.ieee.org
An important subclass of secure multiparty computation is secure sampling: two parties
output samples of a pair of jointly distributed random variables such that neither party learns …
Second-order region for Gray–Wyner network
S Watanabe - IEEE Transactions on Information Theory, 2016 - ieeexplore.ieee.org
The coding problem over the Gray-Wyner network is studied from the second-order coding
rates perspective. A tilted information density for this network is introduced in the spirit of …
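As hedged background on what second-order coding rates mean (the generic single-user expansion, not this paper's multi-terminal result): for a source coding problem at blocklength n and error probability ε, the minimum achievable rate typically expands as

\[
R(n,\epsilon) = R^{*} + \sqrt{\tfrac{V}{n}}\, Q^{-1}(\epsilon) + O\!\Big(\tfrac{\log n}{n}\Big),
\]

where R^* is the first-order limit, V is a dispersion given by the variance of an information density, and Q^{-1} is the inverse Gaussian tail function; the entry above concerns the analogous region-valued expansion for the Gray-Wyner network.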