Analysis of Langevin Monte Carlo from Poincaré to log-Sobolev

S Chewi, MA Erdogdu, M Li, R Shen… - Foundations of …, 2024 - Springer
Classically, the continuous-time Langevin diffusion converges exponentially fast to its
stationary distribution π under the sole assumption that π satisfies a Poincaré inequality …
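
For orientation, the two objects named in this snippet can be written out as follows (the notation $V$ and $C_{\mathrm{PI}}$ is chosen here, not quoted from the paper): the overdamped Langevin diffusion with stationary distribution $\pi \propto e^{-V}$, and the Poincaré inequality for $\pi$,

\[ dX_t = -\nabla V(X_t)\,dt + \sqrt{2}\,dB_t, \qquad \mathrm{Var}_\pi(f) \le C_{\mathrm{PI}}\, \mathbb{E}_\pi\!\big[\|\nabla f\|^2\big] \ \text{ for all smooth } f. \]

Under the Poincaré inequality, the diffusion converges to $\pi$ exponentially fast in chi-squared divergence (equivalently, in variance).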

Non-convex learning via stochastic gradient Langevin dynamics: a nonasymptotic analysis

M Raginsky, A Rakhlin… - Conference on Learning …, 2017 - proceedings.mlr.press
Stochastic Gradient Langevin Dynamics (SGLD) is a popular variant of Stochastic
Gradient Descent, where properly scaled isotropic Gaussian noise is added to an unbiased …
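
As a rough illustration of the update described in this snippet, here is a minimal sketch of one SGLD step (the names sgld_step and grad_fi and the unit-temperature scaling are illustrative assumptions, not the authors' code):

    import numpy as np

    def sgld_step(theta, grad_fi, n_data, batch_idx, step, rng):
        # Unbiased estimate of the full-data gradient from a minibatch,
        # rescaled by the dataset size n_data.
        g = n_data * np.mean([grad_fi(theta, i) for i in batch_idx], axis=0)
        # Properly scaled isotropic Gaussian noise (variance 2*step per coordinate).
        noise = np.sqrt(2.0 * step) * rng.normal(size=theta.shape)
        return theta - step * g + noise

A full run would loop this step over fresh random minibatches, typically with a small or decreasing step size.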

Rapid convergence of the unadjusted Langevin algorithm: Isoperimetry suffices

S Vempala, A Wibisono - Advances in neural information …, 2019 - proceedings.neurips.cc
We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability
distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback …
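
For concreteness, a minimal sketch of the ULA iteration targeting $\nu = e^{-f}$, assuming access to a gradient oracle grad_f (the function and parameter names are illustrative, not from the paper):

    import numpy as np

    def ula_sample(grad_f, x0, step, n_iters, rng):
        # Euler-Maruyama discretization of the Langevin diffusion for nu = exp(-f);
        # no Metropolis-Hastings correction is applied ("unadjusted").
        x = np.asarray(x0, dtype=float)
        for _ in range(n_iters):
            x = x - step * grad_f(x) + np.sqrt(2.0 * step) * rng.normal(size=x.shape)
        return x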

Mean-field Langevin dynamics: Time-space discretization, stochastic gradient, and variance reduction

T Suzuki, D Wu, A Nitanda - Advances in Neural …, 2024 - proceedings.neurips.cc
The mean-field Langevin dynamics (MFLD) is a nonlinear generalization of the Langevin
dynamics that incorporates a distribution-dependent drift, and it naturally arises from the …
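
To make the "distribution-dependent drift" concrete, here is a rough finite-particle sketch of one MFLD step (the drift callback and the temperature parameter lam are illustrative assumptions, not the paper's formulation or code):

    import numpy as np

    def mfld_step(particles, drift, step, lam, rng):
        # particles: (N, d) array whose empirical measure approximates the current law.
        # drift(x, particles) returns the drift at x computed from that empirical
        # measure, which is what makes the dynamics distribution-dependent.
        drifts = np.stack([drift(x, particles) for x in particles])
        noise = rng.normal(size=particles.shape)
        return particles - step * drifts + np.sqrt(2.0 * lam * step) * noise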

Differentially private learning needs hidden state (or much faster convergence)

J Ye, R Shokri - Advances in Neural Information Processing …, 2022 - proceedings.neurips.cc
Prior work on differential privacy analysis of randomized SGD algorithms relies on
composition theorems, where the implicit (unrealistic) assumption is that the internal state of …

Dimension-free log-Sobolev inequalities for mixture distributions

HB Chen, S Chewi, J Niles-Weed - Journal of Functional Analysis, 2021 - Elsevier
We prove that if $(P_x)_{x \in X}$ is a family of probability measures which satisfy the log-Sobolev
inequality and whose pairwise chi-squared divergences are uniformly bounded, and μ is …
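
For reference, the log-Sobolev inequality assumed of each component measure $P_x$ can be written in the standard form (constant conventions vary; this is not quoted from the paper):

\[ \mathrm{Ent}_{P_x}(f^2) \le 2\, C_{\mathrm{LSI}}\, \mathbb{E}_{P_x}\!\big[\|\nabla f\|^2\big], \qquad \mathrm{Ent}_P(g) = \mathbb{E}_P[g \log g] - \mathbb{E}_P[g] \log \mathbb{E}_P[g]. \]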

Improved convergence of score-based diffusion models via prediction-correction

F Pedrotti, J Maas, M Mondelli - arXiv preprint arXiv:2305.14164, 2023 - researchgate.net
Score-based generative models (SGMs) are powerful tools to sample from complex data
distributions. Their underlying idea is to (i) run a forward process for time $T_1$ by adding noise …

Existence of Stein kernels under a spectral gap, and discrepancy bounds

TA Courtade, M Fathi, A Pananjady - 2019 - projecteuclid.org
We establish existence of Stein kernels for probability measures on $\mathbb{R}^d$ satisfying a
Poincaré inequality, and obtain bounds on the Stein discrepancy of such measures …
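
As background, a Stein kernel for a centered probability measure $\mu$ on $\mathbb{R}^d$ is a matrix-valued map $\tau_\mu$ satisfying the integration-by-parts identity (standard definition; conventions vary slightly):

\[ \int \langle x, f(x)\rangle \, d\mu(x) = \int \langle \tau_\mu(x), \nabla f(x)\rangle_{\mathrm{HS}} \, d\mu(x) \quad \text{for all smooth } f:\mathbb{R}^d \to \mathbb{R}^d, \]

and the Stein discrepancy measures how far $\tau_\mu$ is from the identity matrix in $L^2(\mu)$.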

Symmetric mean-field Langevin dynamics for distributional minimax problems

J Kim, K Yamamoto, K Oko, Z Yang… - arXiv preprint arXiv …, 2023 - arxiv.org
In this paper, we extend mean-field Langevin dynamics to minimax optimization over
probability distributions for the first time with symmetric and provably convergent updates …