Diffusion models: A comprehensive survey of methods and applications

L Yang, Z Zhang, Y Song, S Hong, R Xu, Y Zhao… - ACM Computing …, 2023 - dl.acm.org
Diffusion models have emerged as a powerful new family of deep generative models with
record-breaking performance in many applications, including image synthesis, video …

Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models

S Bond-Taylor, A Leach, Y Long… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …

Denoising diffusion probabilistic models

J Ho, A Jain, P Abbeel - Advances in Neural Information …, 2020 - proceedings.neurips.cc
We present high quality image synthesis results using diffusion probabilistic models, a class
of latent variable models inspired by considerations from nonequilibrium thermodynamics …

How to train your energy-based models

Y Song, DP Kingma - arXiv preprint arXiv:2101.03288, 2021 - arxiv.org
Energy-Based Models (EBMs), also known as non-normalized probabilistic models, specify
probability density or mass functions up to an unknown normalizing constant. Unlike most …

Deep generative modeling for protein design

A Strokach, PM Kim - Current Opinion in Structural Biology, 2022 - Elsevier
Deep learning approaches have produced substantial breakthroughs in fields such as
image classification and natural language processing and are making rapid inroads in the …

Unsupervised learning of compositional energy concepts

Y Du, S Li, Y Sharma, J Tenenbaum… - Advances in Neural …, 2021 - proceedings.neurips.cc
Humans are able to rapidly understand scenes by utilizing concepts extracted from prior
experience. Such concepts are diverse, and include global scene descriptors, such as the …

Improved contrastive divergence training of energy based models

Y Du, S Li, J Tenenbaum, I Mordatch - arXiv preprint arXiv:2012.01316, 2020 - arxiv.org
Contrastive divergence is a popular method of training energy-based models, but is known
to have difficulties with training stability. We propose an adaptation to improve contrastive …

Learning to compose visual relations

N Liu, S Li, Y Du, J Tenenbaum… - Advances in Neural …, 2021 - proceedings.neurips.cc
The visual world around us can be described as a structured set of objects and their
associated relations. An image of a room may be conjured given only the description of the …

VAEBM: A symbiosis between variational autoencoders and energy-based models

Z Xiao, K Kreis, J Kautz, A Vahdat - arXiv preprint arXiv:2010.00654, 2020 - arxiv.org
Energy-based models (EBMs) have recently been successful in representing complex
distributions of small images. However, sampling from them requires expensive Markov …

Learning latent space energy-based prior model

B Pang, T Han, E Nijkamp, SC Zhu… - Advances in Neural …, 2020 - proceedings.neurips.cc
We propose an energy-based model (EBM) in the latent space of a generator model, so that
the EBM serves as a prior model that stands on the top-down network of the generator …