Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models

S Bond-Taylor, A Leach, Y Long… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …

Energy-based out-of-distribution detection

W Liu, X Wang, J Owens, Y Li - Advances in neural …, 2020 - proceedings.neurips.cc
Determining whether inputs are out-of-distribution (OOD) is an essential building block for
safely deploying machine learning models in the open world. However, previous methods …
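The energy score proposed in this paper replaces the softmax confidence with the negative log-partition over the logits, E(x) = -T·logsumexp(f(x)/T), so that in-distribution inputs receive lower energy. A minimal numpy sketch of that score (the stable logsumexp trick and the sign convention follow the paper; the threshold for flagging OOD is left to the user):

```python
import numpy as np

def energy_score(logits, T=1.0):
    """Energy score E(x) = -T * logsumexp(f(x)/T) over classifier logits.

    Lower energy indicates a more in-distribution input under the
    convention of Liu et al. (2020). Uses the max-shift trick for
    numerical stability.
    """
    z = logits / T
    m = z.max(axis=-1, keepdims=True)
    return -T * (m.squeeze(-1) + np.log(np.exp(z - m).sum(axis=-1)))
```

A confidently classified input (one dominant logit) thus scores lower energy than a maximally uncertain one, which is what the detector thresholds on.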

Implicit generation and modeling with energy based models

Y Du, I Mordatch - Advances in Neural Information …, 2019 - proceedings.neurips.cc
Energy based models (EBMs) are appealing due to their generality and simplicity in
likelihood modeling, but have been traditionally difficult to train. We present techniques to …

Plug & play generative networks: Conditional iterative generation of images in latent space

A Nguyen, J Clune, Y Bengio… - Proceedings of the …, 2017 - openaccess.thecvf.com
Generating high-resolution, photo-realistic images has been a long-standing goal in
machine learning. Recently, Nguyen et al. 2016 showed one interesting way to synthesize …

Learning generative vision transformer with energy-based latent space for saliency prediction

J Zhang, J Xie, N Barnes, P Li - Advances in Neural …, 2021 - proceedings.neurips.cc
Vision transformer networks have shown superiority in many computer vision tasks. In this
paper, we take a step further by proposing a novel generative vision transformer with latent …

Object representations as fixed points: Training iterative refinement algorithms with implicit differentiation

M Chang, T Griffiths, S Levine - Advances in Neural …, 2022 - proceedings.neurips.cc
Current work in object-centric learning has been motivated by developing learning
algorithms that infer independent and symmetric entities from the perceptual input. This often …

[HTML] A systematic review of flipped classroom and collaborative learning supported by artificial intelligence for learning programming

CG Hidalgo Suárez, JM Llanos Mosquera… - Tecnura, 2021 - scielo.org.co
Objective: The implementation of pedagogical strategies such as the flipped classroom (FC) and
collaborative learning (CL) has contributed to the teaching of programming …

Learning non-convergent non-persistent short-run MCMC toward energy-based model

E Nijkamp, M Hill, SC Zhu… - Advances in Neural …, 2019 - proceedings.neurips.cc
This paper studies a curious phenomenon in learning energy-based models (EBMs) using
MCMC. In each learning iteration, we generate synthesized examples by running a non …
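The short-run sampler described here runs a fixed, small number of Langevin steps from a fresh noise initialization at every iteration, rather than a persistent or convergent chain. A minimal sketch of such a K-step non-persistent Langevin update (the quadratic toy energy in the usage note, and the step-size/noise values, are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def short_run_langevin(energy_grad, x0, steps=100, step_size=0.01,
                       noise=0.01, rng=None):
    """K-step non-persistent Langevin sampler started from x0 (e.g. noise).

    Each step moves downhill on the energy and injects Gaussian noise:
        x <- x - step_size * dE/dx + noise * eps,  eps ~ N(0, I).
    The chain is deliberately short and restarted every call, in the
    spirit of short-run MCMC for EBM training.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = x0.copy()
    for _ in range(steps):
        x = x - step_size * energy_grad(x) + noise * rng.normal(size=x.shape)
    return x
```

For example, with the toy energy E(x) = ||x||²/2 (so dE/dx = x), a chain started far from the origin drifts toward the low-energy region near zero while the injected noise keeps it stochastic.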

On the anatomy of MCMC-based maximum likelihood learning of energy-based models

E Nijkamp, M Hill, T Han, SC Zhu, YN Wu - Proceedings of the AAAI …, 2020 - ojs.aaai.org
This study investigates the effects of Markov chain Monte Carlo (MCMC) sampling in
unsupervised Maximum Likelihood (ML) learning. Our attention is restricted to the family of …

Generalized energy based models

M Arbel, L Zhou, A Gretton - arXiv preprint arXiv:2003.05033, 2020 - arxiv.org
We introduce the Generalized Energy Based Model (GEBM) for generative modelling. These
models combine two trained components: a base distribution (generally an implicit model) …