Stein variational gradient descent: A general purpose Bayesian inference algorithm
We propose a general purpose variational inference algorithm that forms a natural
counterpart of gradient descent for optimization. Our method iteratively transports a set of …
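The entry above describes Stein variational gradient descent (SVGD), which deterministically moves a set of particles toward a target posterior using a kernelized gradient update. A minimal sketch of that update, assuming an RBF kernel with the median-heuristic bandwidth and a toy 1-D Gaussian target N(2, 1) (the function names and step size here are illustrative, not from the paper):

```python
import numpy as np

def rbf_kernel(x, h):
    # Pairwise kernel matrix for 1-D particles x of shape (n,),
    # plus its gradient w.r.t. the first argument:
    # dK[j, i] = d k(x_j, x_i) / d x_j
    d2 = (x[:, None] - x[None, :]) ** 2
    K = np.exp(-d2 / (2 * h ** 2))
    dK = -(x[:, None] - x[None, :]) / h ** 2 * K
    return K, dK

def svgd_step(x, score, eps=0.1):
    # Median heuristic for the kernel bandwidth (Liu & Wang's choice).
    d2 = (x[:, None] - x[None, :]) ** 2
    h = np.sqrt(np.median(d2) / (2 * np.log(len(x) + 1)) + 1e-12)
    K, dK = rbf_kernel(x, h)
    # phi(x_i) = mean_j [ k(x_j, x_i) * score(x_j) + d/dx_j k(x_j, x_i) ]
    # First term pulls particles toward high density; second term repels them apart.
    phi = (K * score(x)[:, None] + dK).mean(axis=0)
    return x + eps * phi

rng = np.random.default_rng(0)
x = rng.normal(-5.0, 0.5, size=100)   # particles start far from the target
score = lambda x: -(x - 2.0)          # grad log-density of N(2, 1)
for _ in range(500):
    x = svgd_step(x, score)
print(round(x.mean(), 1))             # particle mean, expected to land near 2
```

The fixed step size is a simplification; the original paper uses AdaGrad-style adaptive steps, and in higher dimensions the squared distances become row-wise norms over particle vectors.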
Scaling Hamiltonian Monte Carlo inference for Bayesian neural networks with symmetric splitting
Hamiltonian Monte Carlo (HMC) is a Markov chain Monte Carlo (MCMC) approach
that exhibits favourable exploration properties in high-dimensional models such as neural …
It takes (only) two: Adversarial generator-encoder networks
We present a new autoencoder-type architecture that is trainable in an unsupervised mode,
sustains both generation and inference, and has the quality of conditional and unconditional …
NeuTra-lizing bad geometry in Hamiltonian Monte Carlo using neural transport
M Hoffman, P Sountsov, JV Dillon, I Langmore… - arXiv preprint arXiv …, 2019 - arxiv.org
Hamiltonian Monte Carlo is a powerful algorithm for sampling from difficult-to-normalize
posterior distributions. However, when the geometry of the posterior is unfavorable, it may …
Inference via low-dimensional couplings
We investigate the low-dimensional structure of deterministic transformations between
random variables, i.e., transport maps between probability measures. In the context of …
Coupling techniques for nonlinear ensemble filtering
We consider filtering in high-dimensional non-Gaussian state-space models with intractable
transition kernels, nonlinear and possibly chaotic dynamics, and sparse observations in …
Structured neural networks for density estimation and causal inference
Injecting structure into neural networks enables learning functions that satisfy invariances
with respect to subsets of inputs. For instance, when learning generative models using …
Probability flow solution of the Fokker–Planck equation
NM Boffi, E Vanden-Eijnden - Machine Learning: Science and …, 2023 - iopscience.iop.org
The method of choice for integrating the time-dependent Fokker–Planck equation (FPE) in
high-dimension is to generate samples from the solution via integration of the associated …
Seismic tomography using variational inference methods
Seismic tomography is a methodology to image the interior of solid or fluid media and is
often used to map properties in the subsurface of the Earth. In order to better interpret the …
Sequential Monte Carlo with kernel embedded mappings: The mapping particle filter
M Pulido, PJ van Leeuwen - Journal of Computational Physics, 2019 - Elsevier
In this work, a novel sequential Monte Carlo filter is introduced which aims at an efficient
sampling of the state space. Particles are pushed forward from the prediction to the posterior …