Diffusion models: A comprehensive survey of methods and applications
Diffusion models have emerged as a powerful new family of deep generative models with
record-breaking performance in many applications, including image synthesis, video …
See through gradients: Image batch recovery via GradInversion
Training deep neural networks requires gradient estimation from data batches to update
parameters. Per-parameter gradients are averaged over a batch of data, and this has been …
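For context, the attack family this title refers to works by gradient matching: optimize a dummy batch until its gradients reproduce the observed, batch-averaged gradients. Below is a minimal PyTorch sketch of that idea, not the paper's full GradInversion procedure (which adds image priors and batch-norm statistics matching); the network, shapes, and the assumption that labels are known are all illustrative.

```python
# Minimal gradient-matching sketch of batch recovery from averaged gradients.
# Illustrative simplification only, not the paper's full GradInversion method.
import torch
import torch.nn.functional as F

model = torch.nn.Linear(32, 10)              # stand-in network (assumption)
x_true = torch.randn(4, 32)                  # private batch, normally unseen
y_true = torch.randint(0, 10, (4,))          # labels assumed known here

# The quantity shared in distributed training: batch-averaged gradients.
true_grads = torch.autograd.grad(
    F.cross_entropy(model(x_true), y_true), model.parameters())

x_hat = torch.randn(4, 32, requires_grad=True)  # dummy batch to optimize
opt = torch.optim.Adam([x_hat], lr=0.1)
for _ in range(200):
    opt.zero_grad()
    grads = torch.autograd.grad(
        F.cross_entropy(model(x_hat), y_true),
        model.parameters(), create_graph=True)
    # Drive the dummy batch's gradients toward the observed ones.
    mismatch = sum(((g - t) ** 2).sum() for g, t in zip(grads, true_grads))
    mismatch.backward()
    opt.step()
```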
Denoising diffusion probabilistic models
We present high quality image synthesis results using diffusion probabilistic models, a class
of latent variable models inspired by considerations from nonequilibrium thermodynamics …
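For orientation, the standard DDPM construction (stated here from the well-known formulation, not from the truncated snippet) fixes a Markov chain that gradually adds Gaussian noise to data:

```latex
q(x_t \mid x_{t-1}) = \mathcal{N}\!\big(x_t;\ \sqrt{1-\beta_t}\, x_{t-1},\ \beta_t I\big),
\qquad
q(x_t \mid x_0) = \mathcal{N}\!\big(x_t;\ \sqrt{\bar\alpha_t}\, x_0,\ (1-\bar\alpha_t) I\big),
```

with $\alpha_t = 1 - \beta_t$ and $\bar\alpha_t = \prod_{s \le t} \alpha_s$; a learned reverse chain $p_\theta(x_{t-1} \mid x_t)$ is trained to invert the noising.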
Spatio-temporal self-supervised representation learning for 3D point clouds
To date, various 3D scene understanding tasks still lack practical and generalizable pre-
trained models, primarily due to the intricate nature of 3D scene understanding tasks and …
How to train your energy-based models
Energy-Based Models (EBMs), also known as non-normalized probabilistic models, specify
probability density or mass functions up to an unknown normalizing constant. Unlike most …
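The snippet's definition, written out: an energy function $E_\theta$ defines

```latex
p_\theta(x) = \frac{\exp(-E_\theta(x))}{Z_\theta},
\qquad
Z_\theta = \int \exp(-E_\theta(x))\, dx,
```

where the normalizing constant $Z_\theta$ is generally intractable, which is why EBM training needs the specialized techniques the paper surveys.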
Implicit generation and modeling with energy-based models
Y Du, I Mordatch - Advances in Neural Information …, 2019 - proceedings.neurips.cc
Energy-based models (EBMs) are appealing due to their generality and simplicity in
likelihood modeling, but have been traditionally difficult to train. We present techniques to …
Learning generative vision transformer with energy-based latent space for saliency prediction
Vision transformer networks have shown superiority in many computer vision tasks. In this
paper, we take a step further by proposing a novel generative vision transformer with latent …
Learning non-convergent non-persistent short-run MCMC toward energy-based model
This paper studies a curious phenomenon in learning an energy-based model (EBM) using
MCMC. In each learning iteration, we generate synthesized examples by running a non …
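The truncated sentence refers to non-convergent, short-run MCMC: each training iteration starts from fresh noise and runs only a few Langevin steps, never waiting for the chain to mix. A minimal sketch of that sampler, where the step count, step size, and noise scale are illustrative assumptions:

```python
# Short-run, non-persistent Langevin sampling for an EBM with energy E(x).
import torch

def short_run_langevin(energy, x, k=100, step=0.01, noise=0.01):
    """K steps of x <- x - (step/2) * dE/dx + noise * N(0, I)."""
    for _ in range(k):
        x = x.detach().requires_grad_(True)
        grad = torch.autograd.grad(energy(x).sum(), x)[0]
        x = x - 0.5 * step * grad + noise * torch.randn_like(x)
    return x.detach()

# Non-persistent: re-initialize from noise every call instead of reusing chains.
samples = short_run_langevin(lambda x: (x ** 2).sum(dim=1), torch.randn(16, 2))
```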
Improved contrastive divergence training of energy based models
Contrastive divergence is a popular method of training energy-based models, but is known
to have difficulties with training stability. We propose an adaptation to improve contrastive …
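For reference, the baseline objective the paper improves on can be sketched as follows; this shows only standard contrastive divergence with a short Langevin sampler, not the paper's adaptation:

```python
# Basic contrastive-divergence objective for an EBM (baseline idea only).
import torch

def langevin(energy, x, k=60, step=0.01, noise=0.01):
    for _ in range(k):
        x = x.detach().requires_grad_(True)
        g = torch.autograd.grad(energy(x).sum(), x)[0]
        x = x - 0.5 * step * g + noise * torch.randn_like(x)
    return x.detach()

def cd_loss(energy, x_data):
    # Negative samples from short MCMC; minimizing the loss lowers energy on
    # data and raises it on model samples, approximating the ML gradient.
    x_neg = langevin(energy, torch.randn_like(x_data))
    return energy(x_data).mean() - energy(x_neg).mean()
```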
Residual energy-based models for text generation
Text generation is ubiquitous in many NLP tasks, from summarization to dialogue and
machine translation. The dominant parametric approach is based on locally normalized …
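The contrast the snippet sets up can be written compactly. A locally normalized model factorizes sequence probability stepwise, $p(x) = \prod_t p(x_t \mid x_{<t})$, normalizing at every step; a residual EBM instead reweights such a base model globally, along the lines of

```latex
p_\theta(x) \;\propto\; p_{\mathrm{LM}}(x)\, \exp\!\big(-E_\theta(x)\big),
```

where $p_{\mathrm{LM}}$ is a pretrained locally normalized language model and $E_\theta$ scores whole sequences.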