An optimization-centric view on Bayes' rule: Reviewing and generalizing variational inference
We advocate an optimization-centric view of Bayesian inference. Our inspiration is the
representation of Bayes' rule as infinite-dimensional optimization (Csiszár, 1975; Donsker …
Don't blame the ELBO! A linear VAE perspective on posterior collapse
Abstract Posterior collapse in Variational Autoencoders (VAEs) with uninformative priors
arises when the variational posterior distribution closely matches the prior for a subset of …
Differentiable samplers for deep latent variable models
Latent variable models are a popular class of models in statistics. Combined with neural
networks to improve their expressivity, the resulting deep latent variable models have also …
Understanding posterior collapse in generative latent variable models
Posterior collapse in Variational Autoencoders (VAEs) arises when the variational
distribution closely matches the uninformative prior for a subset of latent variables. This …
Score-based diffusion meets annealed importance sampling
More than twenty years after its introduction, Annealed Importance Sampling (AIS) remains
one of the most effective methods for marginal likelihood estimation. It relies on a sequence …
Annealed flow transport Monte Carlo
Abstract Annealed Importance Sampling (AIS) and its Sequential Monte Carlo (SMC)
extensions are state-of-the-art methods for estimating normalizing constants of probability …
The usual suspects? Reassessing blame for VAE posterior collapse
In narrow asymptotic settings Gaussian VAE models of continuous data have been shown to
possess global optima aligned with ground-truth distributions. Even so, it is well known that …
Generalized variational inference: Three arguments for deriving new posteriors
We advocate an optimization-centric view on and introduce a novel generalization of
Bayesian inference. Our inspiration is the representation of Bayes' rule as infinite …
Monte Carlo variational auto-encoders
Variational auto-encoders (VAE) are popular deep latent variable models which are trained
by maximizing an Evidence Lower Bound (ELBO). To obtain tighter ELBO and hence better …
Posterior collapse of a linear latent variable model
This work identifies the existence and cause of a type of posterior collapse that frequently
occurs in the Bayesian deep learning practice. For a general linear latent variable model …