Exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems

AB Barrett - Physical Review E, 2015 - APS
To fully characterize the information that two source variables carry about a third target
variable, one must decompose the total information into redundant, unique, and synergistic …

A novel approach to the partial information decomposition

A Kolchinsky - Entropy, 2022 - mdpi.com
We consider the “partial information decomposition” (PID) problem, which aims to
decompose the information that a set of source random variables provide about a target …
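The quantities that any PID must decompose are the pairwise and joint mutual informations between sources and target. As a minimal illustration (a generic toy computation, not the measure proposed in any paper listed here), the classic XOR example shows why a decomposition is needed: each source alone carries zero information about the target, yet together they determine it completely, so all one bit of joint information is synergistic.

```python
from itertools import product
from math import log2
from collections import Counter

# Toy example: Y = X1 XOR X2 with uniform, independent binary sources.
# Joint distribution as a dict mapping (x1, x2, y) -> probability.
joint = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product([0, 1], repeat=2)}

def entropy(dist):
    """Shannon entropy in bits of a probability dict."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, idxs):
    """Marginalize the joint distribution onto the given coordinate indices."""
    out = Counter()
    for outcome, p in joint.items():
        out[tuple(outcome[i] for i in idxs)] += p
    return dict(out)

def mi(joint, a_idxs, b_idxs):
    """Mutual information I(A;B) = H(A) + H(B) - H(A,B)."""
    return (entropy(marginal(joint, a_idxs))
            + entropy(marginal(joint, b_idxs))
            - entropy(marginal(joint, a_idxs + b_idxs)))

i1 = mi(joint, (0,), (2,))     # I(X1;Y) = 0: each source alone is useless
i2 = mi(joint, (1,), (2,))     # I(X2;Y) = 0
i12 = mi(joint, (0, 1), (2,))  # I(X1,X2;Y) = 1 bit: jointly fully informative
```

Because I(X1;Y) = I(X2;Y) = 0 while I(X1,X2;Y) = 1 bit, any PID must assign the full bit to the synergistic term; how to split such quantities in general is exactly what the papers above debate.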

Introducing a differentiable measure of pointwise shared information

A Makkeh, AJ Gutknecht, M Wibral - Physical Review E, 2021 - APS
Partial information decomposition of the multivariate mutual information describes the
distinct ways in which a set of source variables contains information about a target variable …

Estimating the unique information of continuous variables

A Pakman, A Nejatbakhsh, D Gilboa… - Advances in neural …, 2021 - proceedings.neurips.cc
The integration and transfer of information from multiple sources to multiple targets is a core
motive of neural systems. The emerging field of partial information decomposition (PID) …

Generalised measures of multivariate information content

C Finn, JT Lizier - Entropy, 2020 - mdpi.com
The entropy of a pair of random variables is commonly depicted using a Venn diagram. This
representation is potentially misleading, however, since the multivariate mutual information …
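The point about the Venn diagram being misleading can be made concrete with the same XOR distribution: the central region of the three-variable diagram corresponds to the co-information, which by inclusion–exclusion over entropies can come out negative, something no area in a Venn picture can represent. A minimal sketch (standard definitions, not a construction from the paper above):

```python
from itertools import product
from math import log2
from collections import Counter

# Co-information (the Venn diagram's central region) for Y = X1 XOR X2,
# via inclusion-exclusion over marginal entropies:
# I(X1;X2;Y) = H(X1)+H(X2)+H(Y) - H(X1,X2)-H(X1,Y)-H(X2,Y) + H(X1,X2,Y)
joint = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product([0, 1], repeat=2)}

def entropy(joint, idxs):
    """Entropy in bits of the marginal on the given coordinate indices."""
    marg = Counter()
    for outcome, p in joint.items():
        marg[tuple(outcome[i] for i in idxs)] += p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

co_info = (entropy(joint, (0,)) + entropy(joint, (1,)) + entropy(joint, (2,))
           - entropy(joint, (0, 1)) - entropy(joint, (0, 2)) - entropy(joint, (1, 2))
           + entropy(joint, (0, 1, 2)))
# co_info = -1.0 bit: a negative "area", which no Venn diagram can depict.
```

Here H(X1) = H(X2) = H(Y) = 1, every pairwise entropy is 2, and H(X1,X2,Y) = 2, giving 3 − 6 + 2 = −1 bit.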

Unique information and secret key agreement

RG James, J Emenheiser, JP Crutchfield - Entropy, 2018 - mdpi.com
The partial information decomposition (PID) is a promising framework for decomposing a
joint random variable into the amount of influence each source variable X_i has on a target …

A measure of synergy based on union information

AFC Gomes, MAT Figueiredo - Entropy, 2024 - mdpi.com
The partial information decomposition (PID) framework is concerned with decomposing the
information that a set of (two or more) random variables (the sources) has about another …

Interactions of information transfer along separable causal paths

P Jiang, P Kumar - Physical Review E, 2018 - APS
Complex systems arise as a result of interdependences between multiple variables, whose
causal interactions can be visualized in a time-series graph. Transfer entropy and …

Quantifying reinforcement-learning agent's autonomy, reliance on memory and internalisation of the environment

A Ingel, A Makkeh, O Corcoll, R Vicente - Entropy, 2022 - mdpi.com
Intuitively, the level of autonomy of an agent is related to the degree to which the agent's
goals and behaviour are decoupled from the immediate control by the environment. Here …

The identity of information: how deterministic dependencies constrain information synergy and redundancy

D Chicharro, G Pica, S Panzeri - Entropy, 2018 - mdpi.com
Understanding how different information sources together transmit information is crucial in
many domains. For example, understanding the neural code requires characterizing how …