Diffusion models: A comprehensive survey of methods and applications

L Yang, Z Zhang, Y Song, S Hong, R Xu, Y Zhao… - ACM Computing …, 2023 - dl.acm.org
Diffusion models have emerged as a powerful new family of deep generative models with
record-breaking performance in many applications, including image synthesis, video …

See through gradients: Image batch recovery via GradInversion

H Yin, A Mallya, A Vahdat, JM Alvarez… - Proceedings of the …, 2021 - openaccess.thecvf.com
Training deep neural networks requires gradient estimation from data batches to update
parameters. Gradients per parameter are averaged over a set of data and this has been …

Denoising diffusion probabilistic models

J Ho, A Jain, P Abbeel - Advances in neural information …, 2020 - proceedings.neurips.cc
We present high quality image synthesis results using diffusion probabilistic models, a class
of latent variable models inspired by considerations from nonequilibrium thermodynamics …

Spatio-temporal self-supervised representation learning for 3D point clouds

S Huang, Y Xie, SC Zhu, Y Zhu - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com
To date, various 3D scene understanding tasks still lack practical and generalizable pre-trained
models, primarily due to the intricate nature of 3D scene understanding tasks and …

How to train your energy-based models

Y Song, DP Kingma - arXiv preprint arXiv:2101.03288, 2021 - arxiv.org
Energy-Based Models (EBMs), also known as non-normalized probabilistic models, specify
probability density or mass functions up to an unknown normalizing constant. Unlike most …

Implicit generation and modeling with energy based models

Y Du, I Mordatch - Advances in Neural Information …, 2019 - proceedings.neurips.cc
Energy based models (EBMs) are appealing due to their generality and simplicity in
likelihood modeling, but have been traditionally difficult to train. We present techniques to …

Learning generative vision transformer with energy-based latent space for saliency prediction

J Zhang, J Xie, N Barnes, P Li - Advances in Neural …, 2021 - proceedings.neurips.cc
Vision transformer networks have shown superiority in many computer vision tasks. In this
paper, we take a step further by proposing a novel generative vision transformer with latent …

Learning non-convergent non-persistent short-run MCMC toward energy-based model

E Nijkamp, M Hill, SC Zhu… - Advances in Neural …, 2019 - proceedings.neurips.cc
This paper studies a curious phenomenon in learning energy-based model (EBM) using
MCMC. In each learning iteration, we generate synthesized examples by running a non …

Improved contrastive divergence training of energy based models

Y Du, S Li, J Tenenbaum, I Mordatch - arXiv preprint arXiv:2012.01316, 2020 - arxiv.org
Contrastive divergence is a popular method of training energy-based models, but is known
to have difficulties with training stability. We propose an adaptation to improve contrastive …

Residual energy-based models for text generation

Y Deng, A Bakhtin, M Ott, A Szlam… - arXiv preprint arXiv …, 2020 - arxiv.org
Text generation is ubiquitous in many NLP tasks, from summarization, to dialogue and
machine translation. The dominant parametric approach is based on locally normalized …