Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …
Advances in machine-learning-based sampling motivated by lattice quantum chromodynamics
Sampling from known probability distributions is a ubiquitous task in computational science,
underlying calculations in domains from linguistics to biology and physics. Generative …
VAEBM: A symbiosis between variational autoencoders and energy-based models
Energy-based models (EBMs) have recently been successful in representing complex
distributions of small images. However, sampling from them requires expensive Markov …
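The "expensive Markov …" sampling referred to in this abstract is typically Markov chain Monte Carlo, most often unadjusted Langevin dynamics over the learned energy. A minimal sketch on a toy quadratic energy (not from the paper; the function names and the step/chain settings here are purely illustrative):

```python
import numpy as np

def langevin_sample(grad_energy, x0, step=0.01, n_steps=1000, rng=None):
    """Unadjusted Langevin dynamics:
    x <- x - (step/2) * dE/dx + sqrt(step) * standard normal noise."""
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x - 0.5 * step * grad_energy(x) + np.sqrt(step) * rng.standard_normal(x.shape)
    return x

# Toy energy E(x) = ||x||^2 / 2 (a standard Gaussian); its gradient is x itself,
# so the chains should settle near zero mean and unit-ish variance.
samples = np.stack([
    langevin_sample(lambda x: x, np.zeros(2), step=0.1, n_steps=500,
                    rng=np.random.default_rng(i))
    for i in range(200)
])
```

For a real EBM the gradient comes from backpropagation through the energy network, and each generated sample pays the full cost of such a chain, which is the bottleneck the paper targets.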
Generative PointNet: Deep energy-based learning on unordered point sets for 3D generation, reconstruction and classification
We propose a generative model of unordered point sets, such as point clouds, in the forms
of an energy-based model, where the energy function is parameterized by an input …
A tale of two flows: Cooperative learning of Langevin flow and normalizing flow toward energy-based model
This paper studies the cooperative learning of two generative flow models, in which the two
models are iteratively updated based on the jointly synthesized examples. The first flow …
Structured multi-task learning for molecular property prediction
Multi-task learning for molecular property prediction is becoming increasingly important in
drug discovery. However, in contrast to other domains, the performance of multi-task …
Composing normalizing flows for inverse problems
Given an inverse problem with a normalizing flow prior, we wish to estimate the distribution
of the underlying signal conditioned on the observations. We approach this problem as a …
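The setup this abstract describes combines two log-densities: a prior given by a normalizing flow (via the change-of-variables formula) and a likelihood from the forward model of the inverse problem. A minimal sketch with a toy elementwise affine "flow" and a Gaussian linear likelihood (all names and the specific forward model are illustrative assumptions, not from the paper):

```python
import numpy as np

def flow_log_prior(x, shift, scale):
    """Log-density of x under a toy elementwise affine flow z = (x - shift)/scale
    with a standard-normal base; change of variables subtracts sum(log scale)."""
    z = (x - shift) / scale
    return (-0.5 * np.sum(z ** 2)
            - 0.5 * x.size * np.log(2 * np.pi)
            - np.sum(np.log(scale)))

def log_posterior(x, y, A, noise_std, shift, scale):
    """Unnormalized log-posterior for a linear inverse problem y = A x + noise,
    combining the flow prior with a Gaussian likelihood."""
    resid = y - A @ x
    return flow_log_prior(x, shift, scale) - 0.5 * np.sum(resid ** 2) / noise_std ** 2

A = np.array([[1.0, 0.0]])          # observe only the first coordinate
y = np.array([0.5])
lp = log_posterior(np.zeros(2), y, A, noise_std=0.1,
                   shift=np.zeros(2), scale=np.ones(2))
```

A real flow replaces the affine map with a learned invertible network, but the posterior being targeted has exactly this prior-plus-likelihood structure.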
Variational Bayesian inference with complex geostatistical priors using inverse autoregressive flows
We combine inverse autoregressive flows (IAF) and variational Bayesian inference
(variational Bayes) in the context of geophysical inversion parameterized with deep …
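An inverse autoregressive flow (IAF), the building block this abstract combines with variational Bayes, transforms base noise so that each output coordinate depends only on earlier noise coordinates, giving a triangular Jacobian whose log-determinant is just a sum of log-scales. A minimal single-layer sketch with toy autoregressive maps (the helper names and the particular maps are illustrative, not from the paper):

```python
import numpy as np

def shift(x):
    """Return x shifted one position right with a leading zero, so position i
    only sees x[:i] -- this is what keeps the maps autoregressive."""
    out = np.zeros_like(x)
    out[1:] = x[:-1]
    return out

def iaf_layer(eps, mu_fn, sigma_fn):
    """One IAF layer: z = eps * sigma(eps) + mu(eps). Because mu and sigma
    look only at earlier coordinates, the Jacobian is lower triangular and
    log|det| is simply sum(log sigma)."""
    mu, sigma = mu_fn(eps), sigma_fn(eps)
    z = eps * sigma + mu
    return z, np.sum(np.log(sigma))

# Toy maps: each coordinate conditions on its predecessor.
eps = np.random.default_rng(0).standard_normal(4)
z, log_det = iaf_layer(eps,
                       lambda e: 0.5 * shift(e),
                       lambda e: np.exp(0.1 * shift(e)))
```

In variational Bayes, stacking such layers yields a flexible approximate posterior whose log-density stays cheap to evaluate, which is what makes the combination attractive for geophysical inversion.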
Learning energy-based models by cooperative diffusion recovery likelihood
Training energy-based models (EBMs) with maximum likelihood estimation on high-
dimensional data can be both challenging and time-consuming. As a result, there a …
Bi-level doubly variational learning for energy-based latent variable models
Energy-based latent variable models (EBLVMs) are more expressive than conventional
energy-based models. However, their potential on visual tasks is limited by their training …