Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …
Normalizing flows for probabilistic modeling and inference
Normalizing flows provide a general mechanism for defining expressive probability
distributions, only requiring the specification of a (usually simple) base distribution and a …
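The mechanism the snippet describes can be sketched in a few lines: a simple base distribution pushed through an invertible transform, with the log-density obtained via the change-of-variables formula. The affine map and its parameters below are illustrative assumptions, not from the paper.

```python
import numpy as np

# Minimal normalizing-flow sketch (illustrative): standard-normal base
# distribution pushed through an invertible affine map x = a*z + b.
# Change of variables: log p_x(x) = log p_z((x - b)/a) - log|a|.

a, b = 2.0, 1.0  # parameters of the hypothetical transform

def log_prob(x):
    z = (x - b) / a                                 # inverse transform
    log_base = -0.5 * (z**2 + np.log(2 * np.pi))    # base log-density
    return log_base - np.log(abs(a))                # Jacobian correction

def sample(n, rng=np.random.default_rng(0)):
    z = rng.standard_normal(n)  # draw from the simple base distribution
    return a * z + b            # push forward through the transform

samples = sample(10_000)
print(samples.mean(), samples.std())  # close to b and |a|
```

Real flows stack many such invertible layers with learned parameters; the density computation stays exact as long as each layer's Jacobian log-determinant is tractable.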
Consistency models
Diffusion models have significantly advanced the fields of image, audio, and video
generation, but they depend on an iterative sampling process that causes slow generation …
On neural differential equations
P Kidger - arXiv preprint arXiv:2202.02435, 2022 - arxiv.org
The conjoining of dynamical systems and deep learning has become a topic of great
interest. In particular, neural differential equations (NDEs) demonstrate that neural networks …
Score-based generative modeling through stochastic differential equations
Creating noise from data is easy; creating data from noise is generative modeling. We
present a stochastic differential equation (SDE) that smoothly transforms a complex data …
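The forward half of the idea ("creating noise from data") can be simulated directly. Below is a toy Euler-Maruyama run of a variance-exploding SDE, dx = sigma(t) dW, on a bimodal 1-D dataset; the noise schedule and data are assumptions for illustration, and generation would run the corresponding reverse-time SDE with a learned score.

```python
import numpy as np

# Toy forward diffusion: variance-exploding SDE dx = sigma(t) * dW,
# discretized with Euler-Maruyama. Gradually perturbs a complex data
# distribution into (approximately) a tractable Gaussian prior.

rng = np.random.default_rng(0)
x = rng.choice([-1.0, 1.0], size=5000)  # toy bimodal "data" distribution

T, n_steps = 1.0, 1000
dt = T / n_steps
sigma = lambda t: 1.0 + 9.0 * t         # assumed noise schedule

for i in range(n_steps):
    t = i * dt
    x = x + sigma(t) * np.sqrt(dt) * rng.standard_normal(x.shape)

# The bimodal structure is now washed out into wide, near-Gaussian noise.
print(x.mean(), x.std())
```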
Improved techniques for training consistency models
Y Song, P Dhariwal - arXiv preprint arXiv:2310.14189, 2023 - arxiv.org
Consistency models are a nascent family of generative models that can sample high quality
data in one step without the need for adversarial training. Current consistency models …
GMMSeg: Gaussian mixture based generative semantic segmentation models
Prevalent semantic segmentation solutions are, in essence, a dense discriminative classifier
of p(class | pixel feature). Though straightforward, this de facto paradigm neglects the …
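The contrast the snippet draws can be made concrete: instead of a discriminative p(class | feature), a generative classifier models p(feature | class) and applies Bayes' rule. The sketch below uses a single Gaussian per class on toy 1-D features for simplicity (the paper's title indicates per-class Gaussian *mixtures*); all data and parameters here are hypothetical.

```python
import numpy as np

# Generative classification sketch: fit p(feature | class) as one
# Gaussian per class, then classify by Bayes' rule (equal priors).

rng = np.random.default_rng(0)
feats = {0: rng.normal(-2.0, 1.0, 500),   # toy features for class 0
         1: rng.normal(2.0, 1.0, 500)}    # toy features for class 1

# "Fit" each class-conditional Gaussian from labeled features
params = {c: (f.mean(), f.std()) for c, f in feats.items()}

def log_gauss(x, mu, sd):
    return -0.5 * ((x - mu) / sd) ** 2 - np.log(sd) - 0.5 * np.log(2 * np.pi)

def predict(x):
    # With equal class priors, Bayes' rule reduces to the argmax
    # of the class-conditional log-likelihood.
    return max(params, key=lambda c: log_gauss(x, *params[c]))

print(predict(-1.5), predict(1.8))
```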
NVAE: A deep hierarchical variational autoencoder
Normalizing flows, autoregressive models, variational autoencoders (VAEs), and deep
energy-based models are among competing likelihood-based frameworks for deep …
InstaFlow: One step is enough for high-quality diffusion-based text-to-image generation
Diffusion models have revolutionized text-to-image generation with their exceptional quality
and creativity. However, their multi-step sampling process is known to be slow, often requiring …
Score-based generative modeling with critically-damped langevin diffusion
Score-based generative models (SGMs) have demonstrated remarkable synthesis quality.
SGMs rely on a diffusion process that gradually perturbs the data towards a tractable …