Divergence measures for statistical data processing—An annotated bibliography
M Basseville - Signal Processing, 2013 - Elsevier
Rényi divergence and Kullback-Leibler divergence
T Van Erven, P Harremos - IEEE Transactions on Information …, 2014 - ieeexplore.ieee.org
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is
related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as …
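As a quick companion to this entry, here is a minimal numerical sketch (my own illustration, not code from the paper) of the Rényi divergence of order α for strictly positive discrete distributions, D_α(P‖Q) = (α−1)^{-1} log Σ_i p_i^α q_i^{1−α}, showing that orders near 1 approach the Kullback-Leibler divergence; the distributions p and q below are arbitrary examples.

import numpy as np

def renyi_divergence(p, q, alpha):
    # D_alpha(P||Q) for strictly positive discrete distributions;
    # alpha close to 1 falls back to the Kullback-Leibler divergence.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        return float(np.sum(p * np.log(p / q)))
    return float(np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
for a in (0.5, 0.99, 1.0, 2.0):
    print(a, renyi_divergence(p, q, a))   # the value at 0.99 is already close to the KL value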
[BOOK][B] Information theory: coding theorems for discrete memoryless systems
I Csiszár, J Körner - 2011 - books.google.com
Csiszár and Körner's book is widely regarded as a classic in the field of information theory,
providing deep insights and expert treatment of the key theoretical issues. It includes in …
[PDF][PDF] Lecture notes on information theory
Y Polyanskiy, Y Wu - Lecture Notes for ECE563 (UIUC) and, 2014 - Citeseer
“There is a whole book of readymade, long and convincing, lavishly composed telegrams for
all occasions. Sending such a telegram costs only twenty-five cents. You see, what gets …
On divergences and informations in statistics and information theory
F Liese, I Vajda - IEEE Transactions on Information Theory, 2006 - ieeexplore.ieee.org
The paper deals with the f-divergences of Csiszár generalizing the discrimination
information of Kullback, the total variation distance, the Hellinger divergence, and the …
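To make the family concrete, here is a small sketch (illustrative only; the scalings of the generators vary between references) of the discrete f-divergence D_f(P‖Q) = Σ_i q_i f(p_i/q_i) with three classical generators recovering the Kullback-Leibler divergence, the total variation distance and a squared Hellinger distance.

import numpy as np

def f_divergence(p, q, f):
    # Discrete Csiszar f-divergence for strictly positive p, q.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

kl        = lambda t: t * np.log(t)          # Kullback-Leibler
total_var = lambda t: 0.5 * np.abs(t - 1)    # total variation distance
hellinger = lambda t: (np.sqrt(t) - 1) ** 2  # squared Hellinger (one common scaling)

p, q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
for name, f in [("KL", kl), ("TV", total_var), ("H^2", hellinger)]:
    print(name, f_divergence(p, q, f))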
On the empirical estimation of integral probability metrics
BK Sriperumbudur, K Fukumizu, A Gretton, B Schölkopf, GRG Lanckriet - Electronic Journal of Statistics, 2012 - projecteuclid.org
Given two probability measures P and Q defined on a measurable space S, the integral
probability metric (IPM) is defined as $\gamma_{\mathcal{F}}(P, Q) = \sup\left\{\left|\int_S f\, dP - \int_S f\, dQ\right| \,:\, f \in \mathcal{F}\right\}$ …
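As a concrete, standalone illustration (a standard special case, not necessarily the estimator analysed in the paper): on the real line the IPM over the class of 1-Lipschitz functions is the Wasserstein-1 distance, and between two empirical measures with the same sample size it reduces to the mean absolute difference of the order statistics.

import numpy as np

def empirical_wasserstein1(x, y):
    # Wasserstein-1 distance between two 1-D empirical measures of equal size:
    # sort both samples and average the absolute differences.
    x, y = np.sort(np.asarray(x, dtype=float)), np.sort(np.asarray(y, dtype=float))
    assert len(x) == len(y), "this sketch assumes equal sample sizes"
    return float(np.mean(np.abs(x - y)))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 500)   # samples from P
y = rng.normal(0.5, 1.0, 500)   # samples from Q
print(empirical_wasserstein1(x, y))   # roughly the mean shift of 0.5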
[BOOK][B] Information theory and the central limit theorem
O Johnson - 2004 - books.google.com
This book provides a comprehensive description of a new method of proving the central limit
theorem, through the use of apparently unrelated results from information theory. It gives a …
f-Divergence Inequalities
I Sason, S Verdú - IEEE Transactions on Information Theory, 2016 - ieeexplore.ieee.org
This paper develops systematic approaches to obtain f-divergence inequalities, dealing with
pairs of probability measures defined on arbitrary alphabets. Functional domination is one …
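A short worked instance of the functional-domination idea (a standard textbook case written in my own notation, ignoring the boundary behaviour at t ∈ {0, ∞} that the paper treats carefully): pointwise domination of the generators transfers directly to the divergences.

% Functional domination (sketch): if one generator dominates another pointwise,
% the same constant relates the two f-divergences.
\[
  f_1(t) \le c\, f_2(t) \quad \text{for all } t > 0
  \;\Longrightarrow\;
  D_{f_1}(P \,\|\, Q) \le c\, D_{f_2}(P \,\|\, Q).
\]
% Instance: f_H(t) = \tfrac12(\sqrt{t}-1)^2 and f_{TV}(t) = \tfrac12|t-1| satisfy
% f_H(t) \le f_{TV}(t), since (\sqrt{t}-1)^2 = |\sqrt{t}-1|^2 \le |\sqrt{t}-1|(\sqrt{t}+1) = |t-1|,
% which yields the classical bound
\[
  H^2(P,Q) \;=\; D_{f_H}(P\,\|\,Q) \;\le\; D_{f_{TV}}(P\,\|\,Q) \;=\; \mathrm{TV}(P,Q).
\]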
On integral probability metrics, φ-divergences and binary classification
BK Sriperumbudur, K Fukumizu, A Gretton… - arXiv preprint arXiv …, 2009 - arxiv.org
A class of distance measures on probabilities--the integral probability metrics (IPMs)--is
addressed: these include the Wasserstein distance, Dudley metric, and Maximum Mean …
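Of the IPMs listed in this abstract, the Maximum Mean Discrepancy (the IPM over the unit ball of a reproducing kernel Hilbert space) is the easiest to estimate from samples; below is a minimal sketch, where the Gaussian kernel, the bandwidth and the use of the biased V-statistic are my own simplifying choices rather than the paper's.

import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # Pairwise Gaussian kernel matrix for 1-D samples a and b.
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def mmd2_biased(x, y, sigma=1.0):
    # Biased (V-statistic) estimate of the squared MMD between samples x and y.
    kxx = gaussian_kernel(x, x, sigma).mean()
    kyy = gaussian_kernel(y, y, sigma).mean()
    kxy = gaussian_kernel(x, y, sigma).mean()
    return float(kxx + kyy - 2.0 * kxy)

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 300)
y = rng.normal(0.5, 1.0, 300)
print(mmd2_biased(x, y))   # grows as P and Q become more different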
Properties of classical and quantum Jensen-Shannon divergence
J Briët, P Harremoës - Physical Review A—Atomic, Molecular, and Optical …, 2009 - APS
Jensen-Shannon divergence (JD) is a symmetrized and smoothed version of the most
important divergence measure of information theory, Kullback divergence. As opposed to …
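For reference, a tiny sketch of the classical (non-quantum) Jensen-Shannon divergence for discrete distributions, JSD(P, Q) = ½ KL(P‖M) + ½ KL(Q‖M) with M = (P + Q)/2; unlike the Kullback divergence it is symmetric and bounded by log 2, the bound being attained for distributions with disjoint supports, as in the example below.

import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence with the convention 0 * log(0/x) = 0.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jsd(p, q):
    # Jensen-Shannon divergence: symmetrized, smoothed KL against the midpoint M.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p, q = [1.0, 0.0], [0.0, 1.0]
print(jsd(p, q), np.log(2))   # disjoint supports give the maximum value log 2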