Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions
We provide theoretical convergence guarantees for score-based generative models (SGMs)
such as denoising diffusion probabilistic models (DDPMs), which constitute the backbone of …
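For context, guarantees of this kind concern samplers that discretize the time reversal of a forward noising process. Assuming an Ornstein-Uhlenbeck forward process (one common choice for stating such results, with q_t denoting the law of X_t; the notation here is illustrative), the pair of SDEs is

    $$ dX_t = -X_t\,dt + \sqrt{2}\,dB_t, \qquad X_0 \sim q_0 \ (\text{data}), $$
    $$ dY_t = \bigl(Y_t + 2\nabla \ln q_{T-t}(Y_t)\bigr)\,dt + \sqrt{2}\,d\bar{B}_t, \qquad Y_0 \sim \mathcal{N}(0, I_d), $$

where the score $\nabla \ln q_t$ is replaced in practice by a learned estimate.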
Analysis of Langevin Monte Carlo from Poincaré to log-Sobolev
Classically, the continuous-time Langevin diffusion converges exponentially fast to its
stationary distribution π under the sole assumption that π satisfies a Poincaré inequality …
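A standard way to state the classical result being referenced, for π ∝ e^{-V} satisfying a Poincaré inequality with constant C_PI (convention: Var_π(g) ≤ C_PI E_π[|∇g|²]; V and C_PI are notation introduced here):

    $$ dX_t = -\nabla V(X_t)\,dt + \sqrt{2}\,dB_t, \qquad \chi^2\bigl(\mathrm{law}(X_t)\,\big\|\,\pi\bigr) \le e^{-2t/C_{\mathrm{PI}}}\,\chi^2\bigl(\mathrm{law}(X_0)\,\big\|\,\pi\bigr). $$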
Faster high-accuracy log-concave sampling via algorithmic warm starts
JM Altschuler, S Chewi - Journal of the ACM, 2024 - dl.acm.org
It is a fundamental problem to understand the complexity of high-accuracy sampling from a
strongly log-concave density π on ℝ^d. Indeed, in practice, high-accuracy samplers such as …
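Here "strongly log-concave" and "high accuracy" carry their standard meanings (stated here for reference, with α, β, κ introduced as notation): π ∝ e^{-V} with

    $$ \alpha I_d \preceq \nabla^2 V(x) \preceq \beta I_d \quad \text{for all } x \in \mathbb{R}^d, $$

condition number κ = β/α, and a high-accuracy sampler is one whose complexity scales polylogarithmically in 1/ε.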
Minimax mixing time of the Metropolis-adjusted Langevin algorithm for log-concave sampling
K Wu, S Schmidler, Y Chen - Journal of Machine Learning Research, 2022 - jmlr.org
We study the mixing time of the Metropolis-adjusted Langevin algorithm (MALA) for
sampling from a log-smooth and strongly log-concave distribution. We establish its optimal …
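A minimal sketch of one MALA iteration, assuming access to the potential V = -log π (up to a constant) and its gradient; the function names are illustrative, not taken from the paper:

    import numpy as np

    def mala_step(x, V, grad_V, h, rng):
        # Langevin (Euler-Maruyama) proposal
        y = x - h * grad_V(x) + np.sqrt(2 * h) * rng.standard_normal(x.shape)

        # log-density of the Gaussian proposal q(b | a), up to a shared constant
        def log_q(b, a):
            return -np.sum((b - a + h * grad_V(a)) ** 2) / (4 * h)

        # Metropolis-Hastings correction for the target pi proportional to exp(-V)
        log_alpha = (V(x) - V(y)) + log_q(x, y) - log_q(y, x)
        if np.log(rng.uniform()) < log_alpha:
            return y  # accept the proposal
        return x      # reject and stay put

For a standard Gaussian target one would take, e.g., V = lambda x: 0.5 * np.sum(x**2) and grad_V = lambda x: x. Iterating this step yields a chain whose stationary distribution is exactly π; the mixing-time question studied above is how many such steps are needed from a given initialization.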
Sampling from the mean-field stationary distribution
We study the complexity of sampling from the stationary distribution of a mean-field SDE, or
equivalently, the complexity of minimizing a functional over the space of probability …
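One common instance of this setting (an assumption for illustration; the paper's setup may be more general) is the McKean-Vlasov dynamics with confinement potential V and symmetric pairwise interaction W,

    $$ dX_t = -\nabla V(X_t)\,dt - (\nabla W * \mu_t)(X_t)\,dt + \sqrt{2}\,dB_t, \qquad \mu_t = \mathrm{law}(X_t), $$

whose stationary distribution minimizes the free energy $\mathcal{F}(\mu) = \int V\,d\mu + \tfrac12 \iint W(x-y)\,d\mu(x)\,d\mu(y) + \int \mu\log\mu$ over probability measures.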
Efficient constrained sampling via the mirror-Langevin algorithm
We propose a new discretization of the mirror-Langevin diffusion and give a crisp proof of its
convergence. Our analysis uses relative convexity/smoothness and self-concordance, ideas …
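For reference, the continuous-time mirror-Langevin diffusion (the object being discretized; this is not the paper's specific discretization) uses a strictly convex mirror map φ and targets π ∝ e^{-f}:

    $$ Y_t = \nabla\phi(X_t), \qquad dY_t = -\nabla f(X_t)\,dt + \bigl[2\,\nabla^2\phi(X_t)\bigr]^{1/2}\,dB_t. $$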
Towards a complete analysis of Langevin Monte Carlo: Beyond Poincaré inequality
Langevin diffusions are rapidly convergent under appropriate functional inequality
assumptions. Hence, it is natural to expect that with additional smoothness conditions to …
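Langevin Monte Carlo here is the Euler-Maruyama discretization of the Langevin diffusion targeting π ∝ exp(-V). A minimal sketch (names illustrative):

    import numpy as np

    def langevin_monte_carlo(x0, grad_V, h, n_steps, rng):
        # Iterates x_{k+1} = x_k - h * grad V(x_k) + sqrt(2h) * xi_k with xi_k ~ N(0, I).
        x = np.array(x0, dtype=float)
        for _ in range(n_steps):
            x = x - h * grad_V(x) + np.sqrt(2 * h) * rng.standard_normal(x.shape)
        return x

Unlike MALA above, there is no accept/reject step, so the chain carries an asymptotic bias controlled by the step size h; the functional-inequality assumptions in the snippet govern how fast the initialization error decays.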
Improved dimension dependence of a proximal algorithm for sampling
We propose a sampling algorithm that achieves superior complexity bounds in all the
classical settings (strongly log-concave, log-concave, log-Sobolev inequality (LSI) …
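The proximal-style sampler in this line of work is commonly the alternating scheme below (a sketch under that assumption; η > 0 is a step size introduced here, and the x-update is the so-called restricted Gaussian oracle):

    $$ y_{k+1} \sim \mathcal{N}(x_k, \eta I_d), \qquad x_{k+1} \sim \pi^{X \mid Y = y_{k+1}}(x) \;\propto\; \pi(x)\,\exp\!\Bigl(-\tfrac{1}{2\eta}\|x - y_{k+1}\|^2\Bigr). $$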
Quantum algorithms for sampling log-concave distributions and estimating normalizing constants
Given a convex function $f\colon\mathbb{R}^{d}\to\mathbb{R}$, the problem of sampling
from a distribution $\propto e^{-f(x)}$ is called log-concave sampling. This task has wide …
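The companion estimation problem in the title concerns the normalizing constant

    $$ Z = \int_{\mathbb{R}^d} e^{-f(x)}\,dx, $$

so that the target density is $\pi(x) = e^{-f(x)}/Z$.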
Particle guidance: non-iid diverse sampling with diffusion models
In light of the widespread success of generative models, a significant amount of research
has gone into speeding up their sampling time. However, generative models are often …