Analysis of KNN information estimators for smooth distributions
The KSG mutual information estimator, which is based on the distances of each sample to its k-th
nearest neighbor, is widely used to estimate mutual information between two continuous …
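As background for this entry, a minimal sketch of the k-nearest-neighbor construction the snippet refers to (the KSG estimator of Kraskov, Stögbauer and Grassberger) is given below. The function name ksg_mutual_information, the default k = 3, and the use of SciPy's cKDTree are illustrative assumptions, not details taken from the cited paper.

```python
# Sketch of the KSG (k-nearest-neighbor) mutual information estimator,
# following algorithm 1 of Kraskov, Stoegbauer & Grassberger (2004).
import numpy as np
from scipy.special import digamma
from scipy.spatial import cKDTree

def ksg_mutual_information(x, y, k=3):
    """Estimate I(X;Y) in nats from paired samples x, y of shape (n,) or (n, d)."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = x.shape[0]
    xy = np.hstack([x, y])

    # Distance from each sample to its k-th nearest neighbor in the joint
    # space under the max-norm (the closest neighbor returned is the point itself).
    eps = cKDTree(xy).query(xy, k=k + 1, p=np.inf)[0][:, -1]

    # Number of marginal neighbors strictly closer than eps, excluding the point itself.
    x_tree, y_tree = cKDTree(x), cKDTree(y)
    nx = np.array([len(x_tree.query_ball_point(x[i], eps[i] * (1 - 1e-10), p=np.inf)) - 1
                   for i in range(n)])
    ny = np.array([len(y_tree.query_ball_point(y[i], eps[i] * (1 - 1e-10), p=np.inf)) - 1
                   for i in range(n)])

    # KSG estimate: psi(k) + psi(n) - <psi(nx + 1) + psi(ny + 1)>.
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

For jointly Gaussian samples with correlation rho, such an estimate can be checked against the closed form -0.5 * log(1 - rho**2).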
Successive omniscience
C Chan, A Al-Bashabsheh, Q Zhou… - IEEE Transactions …, 2016 - ieeexplore.ieee.org
Because the exchange of information among all the users in a large network can take a long
time, a successive omniscience protocol is proposed. Namely, subgroups of users first …
On the optimality of secret key agreement via omniscience
For the multiterminal secret key agreement problem under a private source model, it is
known that the maximum key rate, i.e., the secrecy capacity, can be achieved through …
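For context, the known omniscience-based achievability referred to here is usually stated, for the private source model without helpers, as the identity below; the notation follows Csiszár and Narayan and is not necessarily the paper's.

```latex
% X_V = (X_i : i \in V) is the source observed by the users and R_CO the
% minimum total rate of public communication needed for omniscience.
C_S = H(X_V) - R_{\mathrm{CO}}
```

Omniscience of X_V is first reached at total discussion rate R_CO, after which a key of the residual rate H(X_V) - R_CO can be extracted.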
Estimators for multivariate information measures in general probability spaces
A Rahimzamani, H Asnani… - Advances in Neural …, 2018 - proceedings.neurips.cc
Information theoretic quantities play an important role in various settings in machine
learning, including causality testing, structure inference in graphical models, time-series …
Geospatial visualization of indicators for the dynamics of innovation in an educational institution applying clustering techniques
PA Buitrago-Cadavid… - Journal of Physics …, 2023 - iopscience.iop.org
The activities of science, technology, and innovation are related to the execution of actions
involving research, experimental development, support for education and training, provision …
Shared Information for the Cliqueylon Graph
S Bhattacharya, P Narayan - 2023 IEEE International …, 2023 - ieeexplore.ieee.org
Shared information is a measure of mutual dependence among m ≥ 2 jointly distributed
discrete random variables. A new undirected probabilistic graphical model, a cliqueylon …
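As a pointer for this and the related entries below, shared information is commonly given by a minimization over partitions P of {1, …, m} into at least two blocks; the notation here is generic and may differ from the paper's.

```latex
\mathrm{SI}(X_1,\ldots,X_m)
  \;=\; \min_{\mathcal{P}:\,|\mathcal{P}|\ge 2}\;
        \frac{1}{|\mathcal{P}|-1}
        \Big( \sum_{C \in \mathcal{P}} H(X_C) \;-\; H(X_1,\ldots,X_m) \Big)
```

For m = 2 the only admissible partition is the pair of singletons, and the expression reduces to the ordinary mutual information I(X_1; X_2).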
Determining optimal rates for communication for omniscience
This paper considers the communication for omniscience problem: a set of users observe a
discrete memoryless multiple source and want to recover the entire multiple source via noise …
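The rates in question are those of the standard omniscience (Slepian–Wolf-type) region; in generic notation with user set V, a sketch of the region and the minimum sum-rate is:

```latex
\mathcal{R}_{\mathrm{CO}} = \Big\{ r \in \mathbb{R}^{V} :
    \sum_{i \in B} r_i \;\ge\; H(X_B \mid X_{V \setminus B})
    \ \ \text{for all } \emptyset \ne B \subsetneq V \Big\},
\qquad
R_{\mathrm{CO}} = \min_{r \in \mathcal{R}_{\mathrm{CO}}} \sum_{i \in V} r_i
```

This R_CO is the same minimum sum-rate that enters the omniscience-based secrecy capacity identity noted above.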
Shared information for a Markov chain on a tree
S Bhattacharya, P Narayan - IEEE Transactions on Information …, 2024 - ieeexplore.ieee.org
Shared information is a measure of mutual dependence among multiple jointly distributed
random variables with finite alphabets. For a Markov chain on a tree with a given joint …
Universal joint image clustering and registration using multivariate information measures
RK Raman, LR Varshney - IEEE Journal of Selected Topics in …, 2018 - ieeexplore.ieee.org
We consider the problem of universal joint clustering and registration of images. Image
clustering focuses on grouping similar images, while image registration refers to the task of …
Secret key generation for minimally connected hypergraphical sources
Q Zhou, C Chan - IEEE Transactions on Information Theory, 2020 - ieeexplore.ieee.org
This paper investigates secret key generation in the multiterminal source model, where
users observing correlated sources discuss interactively under limited rates to agree on a …
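The source model named in the title can be recalled briefly; in generic notation (an assumption, not the paper's own), a hypergraphical source on a hypergraph (V, E) is built from mutually independent edge variables, with each user observing the edges incident to it:

```latex
% (W_e : e \in E) mutually independent; user i \in V observes
X_i = (W_e : e \in E,\ i \in e)
```

Minimal connectivity is the structural condition the paper places on the hypergraph (V, E).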