Coordinate invariance as a fundamental constraint on the form of stimulus-specific information measures
L Kostal, G D'Onofrio - Biological Cybernetics, 2018 - Springer
The value of Shannon's mutual information is commonly used to describe the total amount of
information that the neural code transfers between the ensemble of stimuli and the …
How much information is associated with a particular stimulus?
DA Butts - Network: Computation in Neural Systems, 2003 - iopscience.iop.org
Although the Shannon mutual information can be used to reveal general features of the
neural code, it cannot directly address which symbols of the code are significant. Further …
Limitations to estimating mutual information in large neural populations
J Mölter, GJ Goodhill - Entropy, 2020 - mdpi.com
Information theory provides a powerful framework to analyse the representation of sensory
stimuli in neural population activity. However, estimating the quantities involved such as …
Stimulus reference frame and neural coding precision
L Kostal - Journal of Mathematical Psychology, 2016 - Elsevier
Any particular stimulus intensity, as a physical quantity, can be equivalently described in
different unit systems. Researchers automatically expect the methodology and the inference …
Fisher and Shannon information in finite neural populations
The precision of the neural code is commonly investigated using two families of statistical
measures: Shannon mutual information and derived quantities when investigating very …
Summary of information theoretic quantities
Information theory is a practical and theoretical framework developed for the study of
communication over noisy channels. Its probabilistic basis and capacity to relate statistical …
Applications of information theory to analysis of neural data
Information theory is a practical and theoretical framework developed for the study of
communication over noisy channels. Its probabilistic basis and capacity to relate statistical …
Mutual information expansion for studying the role of correlations in population codes: how important are autocorrelations?
A Scaglione, G Foffani, G Scannella… - Neural …, 2008 - ieeexplore.ieee.org
The role of correlations in the activity of neural populations responding to a set of stimuli can
be studied within an information theory framework. Regardless of whether one approaches …
Mutual information, Fisher information, and efficient coding
XX Wei, AA Stocker - Neural computation, 2016 - direct.mit.edu
Fisher information is generally believed to represent a lower bound on mutual information
(Brunel & Nadal, 1998), a result that is frequently used in the assessment of neural coding …
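The Fisher information discussed in this entry can be illustrated with a minimal sketch: for a Poisson neuron with a Gaussian tuning curve, the Fisher information about the stimulus is J(θ) = f′(θ)² / f(θ). The tuning-curve parameters below are illustrative assumptions, not values from the cited paper.

```python
import numpy as np

# Gaussian tuning curve f(theta) with illustrative parameters
# (r_max, mu, s are assumptions for the sketch, not from the paper).
r_max, mu, s = 10.0, 0.0, 1.0

def f(theta):
    return r_max * np.exp(-(theta - mu) ** 2 / (2 * s ** 2))

def fisher(theta):
    # For Poisson spiking, J(theta) = f'(theta)^2 / f(theta).
    fp = f(theta) * (-(theta - mu) / s ** 2)  # analytic derivative of f
    return fp ** 2 / f(theta)

print(fisher(1.0))  # Fisher information one tuning width from the peak
```

Note that for the Poisson model J(θ) here equals f′(θ)²/f(θ), so information peaks on the flanks of the tuning curve, where the slope is steepest, rather than at its maximum.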
Quantifying the information transmitted in a single stimulus
M Bezzi - Biosystems, 2007 - Elsevier
Information theory–in particular mutual information–has been widely used to investigate
neural processing in various brain areas. Shannon mutual information quantifies how much …
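The Shannon mutual information that runs through these entries can be computed directly from a joint stimulus–response distribution. The following sketch uses a toy two-stimulus, two-response table with made-up probabilities, purely to illustrate the quantity I(S;R) = Σ p(s,r) log₂ [p(s,r) / (p(s)p(r))].

```python
import numpy as np

# Toy joint distribution p(s, r) for a 2-stimulus, 2-response channel
# (hypothetical values, for illustration only).
p_sr = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_s = p_sr.sum(axis=1)  # marginal over stimuli
p_r = p_sr.sum(axis=0)  # marginal over responses

# Shannon mutual information in bits:
# I(S;R) = sum_{s,r} p(s,r) * log2( p(s,r) / (p(s) p(r)) )
mi = sum(p_sr[i, j] * np.log2(p_sr[i, j] / (p_s[i] * p_r[j]))
         for i in range(2) for j in range(2))
print(round(mi, 4))  # → 0.2781
```

Stimulus-specific decompositions of the kind discussed by Butts and Bezzi split this same sum by stimulus, assigning each s its own contribution to the total.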