How much does your data exploration overfit? Controlling bias via information usage
Modern data is messy and high-dimensional, and it is often not clear a priori what are the
right questions to ask. Instead, the analyst typically needs to use the data to search for …
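As background, the headline bound takes roughly the following form: assuming each candidate statistic \phi_t(X) is \sigma-sub-Gaussian with mean \mu_t, and T is the index the analyst selects after looking at the data X, the bias of exploration is controlled by the information usage I(T; X):

  \[ \bigl|\mathbb{E}[\phi_T(X) - \mu_T]\bigr| \;\le\; \sigma\sqrt{2\, I(T; X)}. \]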
Existence of Stein kernels under a spectral gap, and discrepancy bounds
TA Courtade, M Fathi, A Pananjady - 2019 - projecteuclid.org
We establish existence of Stein kernels for probability measures on R^d satisfying a
Poincaré inequality, and obtain bounds on the Stein discrepancy of such measures …
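For reference, a Stein kernel of a centered measure \mu on \mathbb{R}^d is any matrix-valued map \tau_\mu satisfying, for all smooth test functions f: \mathbb{R}^d \to \mathbb{R}^d,

  \[ \int \langle x, f(x) \rangle \, d\mu \;=\; \int \langle \tau_\mu(x), \nabla f(x) \rangle_{\mathrm{HS}} \, d\mu, \]

and the Stein discrepancy S(\mu) = \inf_{\tau_\mu} \bigl( \int \|\tau_\mu - \mathrm{Id}\|_{\mathrm{HS}}^2 \, d\mu \bigr)^{1/2} quantifies the distance from \tau_\mu = \mathrm{Id}, the case characterizing the standard Gaussian.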
Equivalent characterization of reverse Brascamp-Lieb-type inequalities using information measures
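For orientation, the forward-direction duality (due to Carlen and Cordero-Erausquin) states, roughly, that a Brascamp-Lieb inequality with linear maps B_i, exponents c_i, and best constant C is equivalent to a subadditivity-of-entropy statement:

  \[ h(X) \;\le\; \sum_i c_i\, h(B_i X) + \log C \quad \text{for all random vectors } X. \]

The paper pursues the analogous equivalence for reverse-type inequalities.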
Φ-Entropic Measures of Correlation
A measure of correlation is said to have the tensorization property if it does not change when
computed for i.i.d. copies. More precisely, a measure of correlation between two random …
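A standard illustration, for (X_i, Y_i) i.i.d. copies of (X, Y): the Hirschfeld-Gebelein-Renyi maximal correlation tensorizes,

  \[ \rho_m(X^n; Y^n) = \rho_m(X; Y), \]

whereas mutual information scales linearly, I(X^n; Y^n) = n\, I(X; Y), and so lacks the property.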
Quantum Markov monogamy inequalities
Markovianity lies at the heart of communication problems. This in turn makes the information-
theoretic characterization of Markov processes worthwhile. Data-processing inequalities are …
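The classical prototype of a data-processing inequality: if X \to Y \to Z is a Markov chain, then

  \[ I(X; Z) \;\le\; I(X; Y). \]

The monogamy inequalities of the title refine this kind of statement in the multipartite quantum setting.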
Monotonicity of entropy and Fisher information: a quick proof via maximal correlation
TA Courtade - arXiv preprint arXiv:1610.04174, 2016 - arxiv.org
Published in Communications in Information and Systems.
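The result in question is the monotonicity of entropy along the central limit theorem (Artstein, Ball, Barthe, and Naor): for i.i.d. X_i with finite variance and S_n = (X_1 + \cdots + X_n)/\sqrt{n},

  \[ h(S_{n+1}) \;\ge\; h(S_n), \]

with the companion statement I(S_{n+1}) \le I(S_n) for the Fisher information; Courtade's note derives this via maximal correlation.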
A mutual information inequality and some applications
In this paper we derive an inequality relating linear combinations of mutual information
between subsets of mutually independent random variables and an auxiliary random …
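A classical special case of this family of inequalities: for mutually independent X_1, \ldots, X_n and an arbitrary auxiliary variable Y,

  \[ I(X_1, \ldots, X_n; Y) \;\ge\; \sum_{i=1}^n I(X_i; Y), \]

which follows from the chain rule, since independence gives I(X_i; Y \mid X^{i-1}) = I(X_i; Y) + I(X_i; X^{i-1} \mid Y) \ge I(X_i; Y).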
Maximal correlation and the rate of Fisher information convergence in the central limit theorem
O Johnson - IEEE Transactions on Information Theory, 2020 - ieeexplore.ieee.org
We consider the behaviour of the Fisher information of scaled sums of independent and
identically distributed random variables in the Central Limit Theorem regime. We show how …
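The relevant quantity here is the standardized Fisher information J_{st}(U) = \sigma_U^2\, I(U) - 1, which is nonnegative by Cramér-Rao and vanishes exactly for Gaussians; the CLT question is how fast

  \[ J_{st}\!\left( \frac{X_1 + \cdots + X_n}{\sqrt{n}} \right) \longrightarrow 0, \]

and the title indicates the rate is tied to the maximal correlation of the summands.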
New connections between the entropy power inequality and geometric inequalities
A Marsiglietti, V Kostina - 2018 IEEE International Symposium …, 2018 - ieeexplore.ieee.org
The entropy power inequality (EPI) has a fundamental role in Information Theory, and has
deep connections with famous geometric inequalities. In particular, it is often compared to …
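For concreteness, with entropy power N(X) = e^{2h(X)/d}/(2\pi e) for a random vector X on \mathbb{R}^d, the EPI for independent X and Y reads

  \[ N(X + Y) \;\ge\; N(X) + N(Y), \]

which is formally parallel to the Brunn-Minkowski inequality |A + B|^{1/d} \ge |A|^{1/d} + |B|^{1/d} for Minkowski sums of sets, the classical geometric comparison.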