Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions

S Chen, S Chewi, J Li, Y Li, A Salim… - arXiv preprint arXiv …, 2022 - arxiv.org
We provide theoretical convergence guarantees for score-based generative models (SGMs)
such as denoising diffusion probabilistic models (DDPMs), which constitute the backbone of …
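
For orientation, a minimal sketch (not taken from the paper) of the kind of sampler such guarantees concern: Euler-Maruyama on the reverse-time variance-preserving SDE, assuming a learned score estimate `score(x, t)`. The noise schedule and the toy Gaussian score below are illustrative placeholders, not choices made by the authors.

```python
import numpy as np

def reverse_sde_sample(score, d, n_steps=1000, beta=lambda t: 0.1 + 19.9 * t, rng=None):
    """Euler-Maruyama on the reverse-time VP SDE, driven by a learned score.

    score(x, t): estimate of grad log p_t(x); assumed given (e.g. a trained network).
    """
    rng = np.random.default_rng() if rng is None else rng
    dt = 1.0 / n_steps
    x = rng.standard_normal(d)                   # start from the (approximate) prior N(0, I)
    for k in range(n_steps, 0, -1):
        t = k * dt
        b = beta(t)
        drift = 0.5 * b * x + b * score(x, t)    # reverse-time drift
        x = x + drift * dt + np.sqrt(b * dt) * rng.standard_normal(d)
    return x

# Toy usage: if the target is N(0, I), the exact score is score(x, t) = -x;
# a real DDPM would plug in a trained network here instead.
sample = reverse_sde_sample(lambda x, t: -x, d=2)
```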

Analysis of Langevin Monte Carlo from Poincaré to log-Sobolev

S Chewi, MA Erdogdu, M Li, R Shen… - Foundations of …, 2024 - Springer
Classically, the continuous-time Langevin diffusion converges exponentially fast to its
stationary distribution π under the sole assumption that π satisfies a Poincaré inequality …
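
As a concrete reference point (a sketch under standard assumptions, not code from the monograph), the discrete-time algorithm usually paired with this diffusion is the unadjusted Langevin algorithm, i.e. the Euler-Maruyama discretization targeting $\pi \propto e^{-f}$. The step size and Gaussian toy target below are illustrative.

```python
import numpy as np

def langevin_monte_carlo(grad_f, x0, h=0.01, n_iters=5000, rng=None):
    """Unadjusted Langevin algorithm: x_{k+1} = x_k - h * grad_f(x_k) + sqrt(2h) * xi_k,
    targeting pi proportional to exp(-f)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - h * grad_f(x) + np.sqrt(2 * h) * rng.standard_normal(x.shape)
    return x

# Toy target pi = N(0, I), i.e. f(x) = ||x||^2 / 2 and grad_f(x) = x.
sample = langevin_monte_carlo(lambda x: x, x0=np.zeros(2))
```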

Faster high-accuracy log-concave sampling via algorithmic warm starts

JM Altschuler, S Chewi - Journal of the ACM, 2024 - dl.acm.org
It is a fundamental problem to understand the complexity of high-accuracy sampling from a
strongly log-concave density π on ℝ^d. Indeed, in practice, high-accuracy samplers such as …
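
For context (standard definitions, stated here as background rather than quoted from the snippet): a density $\pi \propto e^{-f}$ on $\mathbb{R}^d$ is $\alpha$-strongly log-concave and $\beta$-log-smooth when $\alpha I \preceq \nabla^2 f \preceq \beta I$, and complexity bounds in this setting are typically expressed in terms of the dimension $d$, the condition number $\kappa = \beta/\alpha$, and the target accuracy $\varepsilon$.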

Minimax mixing time of the Metropolis-adjusted Langevin algorithm for log-concave sampling

K Wu, S Schmidler, Y Chen - Journal of Machine Learning Research, 2022 - jmlr.org
We study the mixing time of the Metropolis-adjusted Langevin algorithm (MALA) for
sampling from a log-smooth and strongly log-concave distribution. We establish its optimal …
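
A minimal sketch of one MALA step, the algorithm whose mixing time is studied here: a Langevin proposal followed by a Metropolis-Hastings correction. The step size and toy Gaussian target are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def mala_step(x, f, grad_f, h, rng):
    """One Metropolis-adjusted Langevin step targeting pi proportional to exp(-f)."""
    # Langevin proposal: y ~ N(x - h * grad_f(x), 2h I)
    y = x - h * grad_f(x) + np.sqrt(2 * h) * rng.standard_normal(x.shape)

    def log_q(b, a):  # log density (up to a constant) of proposing b from a
        return -np.sum((b - a + h * grad_f(a)) ** 2) / (4 * h)

    log_accept = (f(x) - f(y)) + (log_q(x, y) - log_q(y, x))
    return y if np.log(rng.uniform()) < log_accept else x

# Toy usage: standard Gaussian target f(x) = ||x||^2 / 2.
rng = np.random.default_rng(0)
x = np.zeros(2)
for _ in range(1000):
    x = mala_step(x, lambda z: 0.5 * z @ z, lambda z: z, h=0.1, rng=rng)
```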

Sampling from the mean-field stationary distribution

Y Kook, MS Zhang, S Chewi… - The Thirty Seventh …, 2024 - proceedings.mlr.press
We study the complexity of sampling from the stationary distribution of a mean-field SDE, or
equivalently, the complexity of minimizing a functional over the space of probability …
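
For orientation (a standard formulation given as background, not content from the abstract): for a mean-field Langevin SDE $\mathrm{d}X_t = -\nabla V(X_t)\,\mathrm{d}t - \nabla (W * \mu_t)(X_t)\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t$ with $\mu_t = \mathrm{law}(X_t)$ and symmetric interaction $W$, the stationary distribution $\pi$ solves the fixed-point equation $\pi \propto \exp(-V - W * \pi)$, which is the first-order optimality condition for minimizing the free energy $\mathcal{F}(\mu) = \int V\,\mathrm{d}\mu + \tfrac{1}{2}\iint W\,\mathrm{d}\mu\,\mathrm{d}\mu + \int \mu \log \mu$ over probability measures.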

Efficient constrained sampling via the mirror-Langevin algorithm

K Ahn, S Chewi - Advances in Neural Information …, 2021 - proceedings.neurips.cc
We propose a new discretization of the mirror-Langevin diffusion and give a crisp proof of its
convergence. Our analysis uses relative convexity/smoothness and self-concordance, ideas …
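
A minimal sketch of a generic Euler-type discretization of the mirror-Langevin diffusion (not necessarily the exact scheme the paper proposes): take the Langevin step in the dual space $y = \nabla\phi(x)$, with noise preconditioned by $\nabla^2\phi(x)$, then map back through $\nabla\phi^*$. The log-barrier mirror map and exponential toy target below are illustrative assumptions.

```python
import numpy as np

def mirror_langevin_step(x, grad_f, grad_phi, grad_phi_star, hess_phi, h, rng):
    """One mirror-Langevin-type step for a mirror map phi, targeting pi ~ exp(-f)."""
    y = grad_phi(x) - h * grad_f(x)
    # Noise preconditioned by the mirror map's Hessian (square root via Cholesky);
    # the step size must be small enough to keep the iterate inside the domain.
    L = np.linalg.cholesky(hess_phi(x))
    y = y + np.sqrt(2 * h) * (L @ rng.standard_normal(x.shape[0]))
    return grad_phi_star(y)

# Toy usage: sampling on the positive orthant with the log-barrier mirror map
# phi(x) = -sum(log x), so grad_phi(x) = -1/x, grad_phi_star(y) = -1/y,
# hess_phi(x) = diag(1/x^2); target f(x) = sum(x), i.e. i.i.d. Exp(1) coordinates.
rng = np.random.default_rng(0)
x = np.ones(3)
for _ in range(1000):
    x = mirror_langevin_step(
        x,
        grad_f=lambda z: np.ones_like(z),
        grad_phi=lambda z: -1.0 / z,
        grad_phi_star=lambda y: -1.0 / y,
        hess_phi=lambda z: np.diag(1.0 / z**2),
        h=0.01,
        rng=rng,
    )
```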

Towards a complete analysis of Langevin Monte Carlo: Beyond Poincaré inequality

A Mousavi-Hosseini, TK Farghly, Y He… - The Thirty Sixth …, 2023 - proceedings.mlr.press
Langevin diffusions are rapidly convergent under appropriate functional inequality
assumptions. Hence, it is natural to expect that with additional smoothness conditions to …

Improved dimension dependence of a proximal algorithm for sampling

J Fan, B Yuan, Y Chen - The Thirty Sixth Annual Conference …, 2023 - proceedings.mlr.press
We propose a sampling algorithm that achieves superior complexity bounds in all the
classical settings (strongly log-concave, log-concave, logarithmic Sobolev inequality (LSI) …
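
A minimal sketch of the alternating proximal sampling framework this line of work builds on (not necessarily the specific oracle implementation the paper proposes): alternate $y \sim N(x, \eta I)$ with a draw from $\exp(-f(x) - \|x - y\|^2/(2\eta))$, the latter realized here by rejection sampling around the proximal point, which is valid for convex smooth $f$. The step size $\eta$ and Gaussian toy target are illustrative assumptions.

```python
import numpy as np

def proximal_sampler(f, grad_f, prox_f, x0, eta=0.5, n_iters=500, rng=None):
    """Alternating (Gibbs-type) proximal sampler targeting pi proportional to exp(-f).

    Each iteration: y ~ N(x, eta I), then x ~ exp(-f(x) - ||x - y||^2 / (2 eta)),
    drawn by rejection sampling around the proximal point (valid for convex f).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        y = x + np.sqrt(eta) * rng.standard_normal(x.shape)
        xs = prox_f(y, eta)                      # argmin_z f(z) + ||z - y||^2 / (2 eta)
        while True:                              # restricted Gaussian oracle via rejection
            z = xs + np.sqrt(eta) * rng.standard_normal(x.shape)
            log_acc = -(f(z) - f(xs) - grad_f(xs) @ (z - xs))
            if np.log(rng.uniform()) < log_acc:
                x = z
                break
    return x

# Toy usage: standard Gaussian target f(x) = ||x||^2 / 2, whose prox is y / (1 + eta).
sample = proximal_sampler(
    f=lambda z: 0.5 * z @ z,
    grad_f=lambda z: z,
    prox_f=lambda y, eta: y / (1 + eta),
    x0=np.zeros(2),
)
```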

Quantum algorithms for sampling log-concave distributions and estimating normalizing constants

AM Childs, T Li, JP Liu, C Wang… - Advances in Neural …, 2022 - proceedings.neurips.cc
Given a convex function $f\colon\mathbb{R}^{d}\to\mathbb{R}$, the problem of sampling
from a distribution $\propto e^{-f(x)}$ is called log-concave sampling. This task has wide …
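
For context (the standard problem statements, not additional content from the abstract): log-concave sampling asks for draws from $\pi(x) \propto e^{-f(x)}$ with $f$ convex, while the companion estimation task asks for the normalizing constant $Z = \int_{\mathbb{R}^d} e^{-f(x)}\,\mathrm{d}x$, typically to within a multiplicative factor $1 \pm \varepsilon$.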

Particle guidance: non-I.I.D. diverse sampling with diffusion models

G Corso, Y Xu, V De Bortoli, R Barzilay… - arXiv preprint arXiv …, 2023 - arxiv.org
In light of the widespread success of generative models, a significant amount of research
has gone into speeding up their sampling time. However, generative models are often …