Analysis of Langevin Monte Carlo from Poincaré to log-Sobolev
Classically, the continuous-time Langevin diffusion converges exponentially fast to its
stationary distribution π under the sole assumption that π satisfies a Poincaré inequality …
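For orientation, the Poincaré inequality and the exponential decay it yields for the continuous-time Langevin diffusion are commonly stated as follows (a standard formulation, not quoted from this paper): π satisfies a Poincaré inequality with constant $C_{\mathrm{P}}$ if
\[
\mathrm{Var}_\pi(f) \le C_{\mathrm{P}}\, \mathbb{E}_\pi\!\left[\|\nabla f\|^2\right] \quad \text{for all smooth } f,
\]
and then the diffusion $dX_t = -\nabla V(X_t)\,dt + \sqrt{2}\,dB_t$ with $\pi \propto e^{-V}$ satisfies
\[
\chi^2\!\left(\mathrm{law}(X_t)\,\middle\|\,\pi\right) \le e^{-2t/C_{\mathrm{P}}}\, \chi^2\!\left(\mathrm{law}(X_0)\,\middle\|\,\pi\right).
\]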
Non-convex learning via stochastic gradient Langevin dynamics: a nonasymptotic analysis
M Raginsky, A Rakhlin… - Conference on Learning …, 2017 - proceedings.mlr.press
Abstract Stochastic Gradient Langevin Dynamics (SGLD) is a popular variant of Stochastic
Gradient Descent, where properly scaled isotropic Gaussian noise is added to an unbiased …
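For concreteness, the update described in this abstract (an unbiased stochastic gradient step plus properly scaled isotropic Gaussian noise) can be sketched as follows; the names sgld_step, eta, and beta are illustrative placeholders, not the paper's notation.

    import numpy as np

    def sgld_step(theta, stoch_grad, eta, beta, rng):
        """One SGLD update: unbiased gradient step plus sqrt(2*eta/beta) Gaussian noise."""
        noise = rng.standard_normal(theta.shape)
        return theta - eta * stoch_grad(theta) + np.sqrt(2.0 * eta / beta) * noise

    # Illustrative usage on the toy objective f(x) = 0.5 * ||x||^2 (gradient is x).
    rng = np.random.default_rng(0)
    theta = rng.standard_normal(5)
    for _ in range(1000):
        theta = sgld_step(theta, stoch_grad=lambda x: x, eta=1e-2, beta=10.0, rng=rng)

With small step size, the iterates approximately sample from the Gibbs measure proportional to exp(-beta * f).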
Rapid convergence of the unadjusted Langevin algorithm: Isoperimetry suffices
S Vempala, A Wibisono - Advances in neural information …, 2019 - proceedings.neurips.cc
Abstract We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability
distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback …
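The iteration analyzed here is the standard unadjusted Langevin algorithm; in common notation (not necessarily the paper's), one step with step size $h$ reads
\[
x_{k+1} = x_k - h\,\nabla f(x_k) + \sqrt{2h}\,\xi_k, \qquad \xi_k \sim \mathcal{N}(0, I_n),
\]
i.e. the Euler–Maruyama discretization of the Langevin diffusion whose stationary distribution is $\nu \propto e^{-f}$.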
Mean-field Langevin dynamics: Time-space discretization, stochastic gradient, and variance reduction
The mean-field Langevin dynamics (MFLD) is a nonlinear generalization of the Langevin
dynamics that incorporates a distribution-dependent drift, and it naturally arises from the …
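As a sketch of the distribution-dependent drift mentioned here (a common formulation, not quoted from this paper), MFLD can be written as the McKean–Vlasov SDE
\[
dX_t = -\nabla_x \frac{\delta F}{\delta \mu}(\mu_t)(X_t)\,dt + \sqrt{2\lambda}\,dB_t, \qquad \mu_t = \mathrm{law}(X_t),
\]
where $F$ is a functional on probability measures, $\delta F/\delta \mu$ its first variation, and $\lambda > 0$ an entropic regularization (temperature) parameter.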
Differentially private learning needs hidden state (or much faster convergence)
Prior work on differential privacy analysis of randomized SGD algorithms relies on
composition theorems, where the implicit (unrealistic) assumption is that the internal state of …
Dimension-free log-Sobolev inequalities for mixture distributions
We prove that if $(P_x)_{x\in X}$ is a family of probability measures which satisfy the log-Sobolev
inequality and whose pairwise chi-squared divergences are uniformly bounded, and μ is …
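For reference, one common normalization of the log-Sobolev inequality assumed for each component (a standard definition, not specific to this paper) is
\[
\mathrm{Ent}_{P}(f^2) \le 2C\, \mathbb{E}_{P}\!\left[\|\nabla f\|^2\right], \qquad \mathrm{Ent}_{P}(g) = \mathbb{E}_{P}[g\log g] - \mathbb{E}_{P}[g]\log\mathbb{E}_{P}[g],
\]
with $\chi^2(P_x\,\|\,P_{x'})$ denoting the usual chi-squared divergence between pairs of components.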
Improved convergence of score-based diffusion models via prediction-correction
Score-based generative models (SGMs) are powerful tools to sample from complex data
distributions. Their underlying idea is to (i) run a forward process for time $T_1$ by adding noise …
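As background for the prediction-correction scheme discussed here (one standard choice of forward process, not taken from the paper), the forward noising dynamics and its time reversal can be written as
\[
dX_t = -X_t\,dt + \sqrt{2}\,dB_t, \qquad dY_t = \bigl(Y_t + 2\nabla\log p_{T_1-t}(Y_t)\bigr)\,dt + \sqrt{2}\,d\bar{B}_t,
\]
where $p_t$ is the marginal density of the forward process; the corrector then runs a few Langevin steps targeting $p_{T_1-t}$ using the learned score $\nabla\log p_{T_1-t}$.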
Existence of Stein kernels under a spectral gap, and discrepancy bounds
TA Courtade, M Fathi, A Pananjady - 2019 - projecteuclid.org
We establish existence of Stein kernels for probability measures on $\mathbb{R}^d$ satisfying a
Poincaré inequality, and obtain bounds on the Stein discrepancy of such measures …
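In the usual formulation (a standard definition, not quoted from this paper), a Stein kernel of a centered probability measure $\mu$ on $\mathbb{R}^d$ is a matrix-valued map $\tau$ such that
\[
\mathbb{E}_\mu\!\left[\langle X, f(X)\rangle\right] = \mathbb{E}_\mu\!\left[\langle \tau(X), \nabla f(X)\rangle_{\mathrm{HS}}\right] \quad \text{for all smooth } f:\mathbb{R}^d\to\mathbb{R}^d,
\]
and the Stein discrepancy relative to the standard Gaussian is $S(\mu)^2 = \inf_\tau \mathbb{E}_\mu\|\tau(X) - I_d\|_{\mathrm{HS}}^2$.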
Symmetric mean-field Langevin dynamics for distributional minimax problems
In this paper, we extend mean-field Langevin dynamics to minimax optimization over
probability distributions for the first time with symmetric and provably convergent updates …