Improved contrastive divergence training of energy-based models
Contrastive divergence is a popular method of training energy-based models, but is known
to have difficulties with training stability. We propose an adaptation to improve contrastive …
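As background for this entry, contrastive divergence approximates the maximum-likelihood gradient of an energy function $E_\theta$ by contrasting data samples against negative samples produced by a short MCMC run; a minimal sketch of the standard update (general background, not the specific adaptation this paper proposes) is
\[
\nabla_\theta \mathcal{L}_{\mathrm{CD}} \;\approx\; \mathbb{E}_{x^{+} \sim p_{\mathrm{data}}}\!\big[\nabla_\theta E_\theta(x^{+})\big] \;-\; \mathbb{E}_{x^{-} \sim q_\theta^{(k)}}\!\big[\nabla_\theta E_\theta(x^{-})\big],
\]
where $q_\theta^{(k)}$ denotes the distribution obtained by running $k$ steps of MCMC (e.g., Langevin dynamics) initialized at the data.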
VAEBM: A symbiosis between variational autoencoders and energy-based models
Energy-based models (EBMs) have recently been successful in representing complex
distributions of small images. However, sampling from them requires expensive Markov …
Learning latent space energy-based prior model
We propose an energy-based model (EBM) in the latent space of a generator model, so that
the EBM serves as a prior model that stands on the top-down network of the generator …
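For orientation (the parameterization below is the formulation commonly used for latent-space EBM priors and is given as an assumption rather than a quote from the abstract), the prior is an exponential tilting of a simple reference distribution $p_0(z)$, coupled with a top-down generator $p_\beta(x \mid z)$:
\[
p_\alpha(z) \;=\; \frac{1}{Z(\alpha)} \exp\!\big(f_\alpha(z)\big)\, p_0(z), \qquad p_{\alpha,\beta}(x) \;=\; \int p_\beta(x \mid z)\, p_\alpha(z)\, dz,
\]
so the learned term $f_\alpha$ reshapes the prior in latent space while the generator maps $z$ to data space.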
On the anatomy of MCMC-based maximum likelihood learning of energy-based models
This study investigates the effects of Markov chain Monte Carlo (MCMC) sampling in
unsupervised Maximum Likelihood (ML) learning. Our attention is restricted to the family of …
Your GAN is secretly an energy-based model and you should use discriminator driven latent sampling
We show that the sum of the implicit generator log-density $\log p_g$ of a GAN with the logit
score of the discriminator defines an energy function which yields the true data density when …
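Writing out the relation stated in this abstract: with $d(x)$ the discriminator logit and $p_g$ the implicit generator density, the induced energy and the corresponding density are (under the usual assumption of a sufficiently well-trained discriminator)
\[
E(x) \;=\; -\log p_g(x) \;-\; d(x), \qquad p(x) \;\propto\; p_g(x)\, e^{\,d(x)},
\]
which motivates sampling with MCMC in the GAN's latent space, guided by the discriminator (discriminator driven latent sampling).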
Abstract spatial-temporal reasoning via probabilistic abduction and execution
Spatial-temporal reasoning is a challenging task in Artificial Intelligence (AI) due to its
demanding but unique nature: a theoretic requirement on representing and reasoning …
A contrastive learning approach for training variational autoencoder priors
Variational autoencoders (VAEs) are one of the powerful likelihood-based generative
models with applications in many domains. However, they struggle to generate high-quality …
Learning layout and style reconfigurable GANs for controllable image synthesis
With the remarkable recent progress on learning deep generative models, it becomes
increasingly interesting to develop models for controllable image synthesis from …
Generative PointNet: Deep energy-based learning on unordered point sets for 3D generation, reconstruction and classification
We propose a generative model of unordered point sets, such as point clouds, in the form
of an energy-based model, where the energy function is parameterized by an input …
A tale of two flows: Cooperative learning of Langevin flow and normalizing flow toward energy-based model
This paper studies the cooperative learning of two generative flow models, in which the two
models are iteratively updated based on the jointly synthesized examples. The first flow …