Rapid convergence of the unadjusted Langevin algorithm: Isoperimetry suffices
S Vempala, A Wibisono - Advances in neural information …, 2019 - proceedings.neurips.cc
Abstract We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability
distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback …
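The ULA update is the Euler-Maruyama discretization of the Langevin diffusion: $x_{k+1} = x_k - h \nabla f(x_k) + \sqrt{2h}\,\xi_k$ with $\xi_k \sim \mathcal{N}(0, I_n)$. A minimal runnable sketch; the Gaussian target, step size, and iteration count below are illustrative choices, not the paper's:

```python
import numpy as np

def ula_sample(grad_f, x0, step, n_iters, rng=None):
    """Unadjusted Langevin Algorithm: x <- x - step * grad_f(x) + sqrt(2 * step) * noise."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    samples = np.empty((n_iters,) + x.shape)
    for k in range(n_iters):
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
        samples[k] = x
    return samples

# Illustrative target: nu = e^{-f} with f(x) = ||x||^2 / 2 (standard Gaussian),
# so grad_f(x) = x; the iterates should settle near the target's mean.
samples = ula_sample(grad_f=lambda x: x, x0=np.zeros(2), step=0.01, n_iters=5000)
print(samples[2500:].mean(axis=0))  # roughly zero after burn-in
```

Note the absence of a Metropolis accept/reject step ("unadjusted"): the discretization bias means ULA's stationary distribution is a perturbation of $\nu$, which is what makes non-asymptotic guarantees of the paper's kind interesting.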
Rényi divergence and Kullback-Leibler divergence
T van Erven, P Harremoës - IEEE Transactions on Information …, 2014 - ieeexplore.ieee.org
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is
related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as …
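For reference, the quantity in question (standard notation, not necessarily the paper's): for densities $p, q$ with respect to a dominating measure $\mu$ and order $\alpha \in (0,1) \cup (1,\infty)$,

$$ D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, \mathrm{d}\mu(x), \qquad \lim_{\alpha \to 1} D_\alpha(P \,\|\, Q) = D_{\mathrm{KL}}(P \,\|\, Q), $$

so Kullback-Leibler divergence is recovered as the order-one limit, mirroring how Shannon entropy arises as a limit of Rényi entropy.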
Families of alpha-, beta- and gamma-divergences: Flexible and robust measures of similarities
A Cichocki, S Amari - Entropy, 2010 - mdpi.com
In this paper, we extend and overview wide families of Alpha-, Beta-, and Gamma-
divergences and discuss their fundamental properties. In the literature, usually only one single …
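As one concrete member of these families (a common parameterization of the Alpha-divergence; the paper surveys several equivalent conventions):

$$ D_A^{(\alpha)}(P \,\|\, Q) = \frac{1}{\alpha(\alpha - 1)} \left( \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, \mathrm{d}\mu(x) - 1 \right), \qquad \alpha \neq 0, 1, $$

which recovers $D_{\mathrm{KL}}(P \,\|\, Q)$ as $\alpha \to 1$ and $D_{\mathrm{KL}}(Q \,\|\, P)$ as $\alpha \to 0$; the flexibility/robustness trade-off comes from tuning $\alpha$.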
Rényi divergence measures for commonly used univariate continuous distributions
M Gil, F Alajaji, T Linder - Information Sciences, 2013 - Elsevier
Probabilistic 'distances' (also called divergences), which in some sense assess how
'close' two probability distributions are to one another, have been widely employed in …
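A representative entry of the kind such tabulations contain, for univariate Gaussians $P = \mathcal{N}(\mu_1, \sigma_1^2)$ and $Q = \mathcal{N}(\mu_2, \sigma_2^2)$ (the standard closed form, stated here rather than quoted; valid when $\sigma_\alpha^2 := \alpha \sigma_2^2 + (1-\alpha)\sigma_1^2 > 0$):

$$ D_\alpha(P \,\|\, Q) = \ln\frac{\sigma_2}{\sigma_1} + \frac{1}{2(\alpha - 1)} \ln\frac{\sigma_2^2}{\sigma_\alpha^2} + \frac{\alpha\,(\mu_1 - \mu_2)^2}{2 \sigma_\alpha^2}, $$

which reduces to the familiar Gaussian Kullback-Leibler divergence as $\alpha \to 1$.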
Rényi entropy and free energy
JC Baez - Entropy, 2022 - mdpi.com
The Rényi entropy is a generalization of the usual concept of entropy which depends on a
parameter q. In fact, Rényi entropy is closely related to free energy. Suppose we start with a …
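The relation can be made precise. Working in units where the initial temperature is $1$, let $\rho = e^{-H}/Z(1)$ be the Gibbs state of a Hamiltonian $H$, with $Z(T) = \operatorname{tr} e^{-H/T}$ and free energy $F(T) = -T \ln Z(T)$. Then from the definition $S_q = \frac{1}{1-q} \ln \operatorname{tr} \rho^q$ (a sketch following common conventions, not the paper verbatim):

$$ \operatorname{tr} \rho^q = \frac{Z(1/q)}{Z(1)^q} \;\Longrightarrow\; S_q = \frac{q \left( F(1) - F(1/q) \right)}{1 - q} = -\left. \frac{F(T) - F(1)}{T - 1} \right|_{T = 1/q}, $$

i.e. the order-$q$ Rényi entropy is minus the free energy difference per unit temperature change as the temperature moves from $1$ to $1/q$.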
Adjusted Rényi entropic value-at-risk
Z Zou, Q Wu, Z Xia, T Hu - European Journal of Operational Research, 2023 - Elsevier
Entropy is a measure of self-information or uncertainty. Using different concepts of entropy,
we may get different risk measures by dual representation. In this paper, we introduce and …
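For orientation, a sketch of the classical (Kullback-Leibler) case that such Rényi variants generalize: Ahmadi-Javid's entropic value-at-risk at confidence level $1-\alpha$ (stated from the standard literature, not from this paper) satisfies

$$ \mathrm{EVaR}_{1-\alpha}(X) = \inf_{z > 0} \frac{1}{z} \ln \frac{\mathbb{E}\!\left[ e^{zX} \right]}{\alpha} = \sup\left\{ \mathbb{E}_Q[X] \;:\; D_{\mathrm{KL}}(Q \,\|\, P) \le \ln(1/\alpha) \right\}, $$

where the supremum on the right is the dual representation; swapping the Kullback-Leibler ball for a Rényi-divergence ball yields Rényi-type risk measures.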
Entropy as a Tool for the Analysis of Stock Market Efficiency During Periods of Crisis
D Papla, R Siedlecki - Entropy, 2024 - mdpi.com
In the article, we analyse the efficient market hypothesis using entropy at moments of
transition from a normal economic situation to crises or slowdowns in European …
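A minimal sketch of the kind of entropy estimate such analyses build on; the histogram discretization and synthetic data here are illustrative assumptions, not the authors' methodology:

```python
import numpy as np

def shannon_entropy(returns, bins):
    """Shannon entropy (bits) of a return series, discretized on fixed bin edges."""
    counts, _ = np.histogram(returns, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                  # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
bins = np.linspace(-0.1, 0.1, 21)             # shared grid keeps entropies comparable
diffuse = rng.normal(0.0, 0.03, 1000)         # spread-out returns -> higher entropy
concentrated = rng.normal(0.0, 0.005, 1000)   # concentrated returns -> lower entropy
print(shannon_entropy(diffuse, bins), shannon_entropy(concentrated, bins))
```

The efficiency reading is that returns in an informationally efficient market should look close to unpredictable noise, i.e. carry high entropy, so regime shifts show up as entropy changes.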
Rényi divergence and majorization
T van Erven, P Harremoës - 2010 IEEE International …, 2010 - ieeexplore.ieee.org
Rényi divergence is related to Rényi entropy much like information divergence (also called
Kullback-Leibler divergence or relative entropy) is related to Shannon's entropy, and comes …
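One of the basic properties established in this line of work is that $D_\alpha$ is nondecreasing in the order $\alpha$. A quick numerical sanity check on discrete distributions; the distributions and the $\alpha$ grid are arbitrary illustrative choices:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """D_alpha(p || q) for discrete distributions, alpha != 1 (KL is the alpha -> 1 limit)."""
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.4, 0.4, 0.2])
alphas = np.linspace(0.1, 5.0, 50)
alphas = alphas[~np.isclose(alphas, 1.0)]     # avoid the removable singularity at alpha = 1
values = [renyi_divergence(p, q, a) for a in alphas]
assert all(a <= b + 1e-12 for a, b in zip(values, values[1:]))  # nondecreasing in alpha
```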
Conditional Rényi Divergence Saddlepoint and the Maximization of α-Mutual Information
Rényi-type generalizations of entropy, relative entropy and mutual information have found
numerous applications throughout information theory and beyond. While there is consensus …
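For orientation, one standard definition among those the literature compares (Sibson's proposal; generic notation, not necessarily the paper's): for discrete $X, Y$,

$$ I_\alpha(X; Y) = \min_{Q_Y} D_\alpha\!\left( P_{XY} \,\middle\|\, P_X \times Q_Y \right) = \frac{\alpha}{\alpha - 1} \log \sum_{y} \left( \sum_{x} P_X(x)\, P_{Y \mid X}(y \mid x)^{\alpha} \right)^{1/\alpha}, $$

and maximizing this quantity over the input distribution $P_X$ is the problem referred to in the title.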
On Rényi divergence measures for continuous alphabet sources
M Gil - PhD Thesis, 2011 - library-archives.canada.ca
Abstract The idea of 'probabilistic distances' (also called divergences), which in
some sense assess how 'close' two probability distributions are to one another, has been …
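As a worked example of the continuous-alphabet closed forms such work derives (a short direct computation, not quoted from the thesis): for exponential densities $p(x) = \lambda_1 e^{-\lambda_1 x}$ and $q(x) = \lambda_2 e^{-\lambda_2 x}$ on $x \ge 0$, with $\lambda_\alpha := \alpha \lambda_1 + (1-\alpha)\lambda_2 > 0$,

$$ \int_0^{\infty} p(x)^{\alpha}\, q(x)^{1-\alpha}\, \mathrm{d}x = \frac{\lambda_1^{\alpha} \lambda_2^{1-\alpha}}{\lambda_\alpha} \quad\Longrightarrow\quad D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \ln \frac{\lambda_1^{\alpha} \lambda_2^{1-\alpha}}{\alpha \lambda_1 + (1-\alpha)\lambda_2}, $$

whose $\alpha \to 1$ limit is the familiar $D_{\mathrm{KL}} = \ln(\lambda_1/\lambda_2) + \lambda_2/\lambda_1 - 1$.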