An invitation to sequential Monte Carlo samplers

C Dai, J Heng, PE Jacob, N Whiteley - Journal of the American …, 2022 - Taylor & Francis
Statisticians often use Monte Carlo methods to approximate probability
distributions, primarily with Markov chain Monte Carlo and importance sampling. Sequential …

Turing: a language for flexible probabilistic inference

H Ge, K Xu, Z Ghahramani - International conference on …, 2018 - proceedings.mlr.press
Probabilistic programming promises to simplify and democratize probabilistic machine
learning, but successful probabilistic programming systems require flexible, generic and …

Inference compilation and universal probabilistic programming

TA Le, AG Baydin, F Wood - Artificial Intelligence and …, 2017 - proceedings.mlr.press
We introduce a method for using deep neural networks to amortize the cost of inference in
models from the family induced by universal probabilistic programming languages …

Elements of sequential Monte Carlo

CA Naesseth, F Lindsten… - Foundations and Trends …, 2019 - nowpublishers.com
A core problem in statistics and probabilistic machine learning is to compute probability
distributions and expectations. This is the fundamental problem of Bayesian statistics and …

Probabilistic programs as an action description language

RI Brafman, D Tolpin, O Wertheim - … of the AAAI Conference on Artificial …, 2023 - ojs.aaai.org
Action description languages (ADLs), such as STRIPS, PDDL, and RDDL, specify the input
format for planning algorithms. Unfortunately, their syntax is familiar to planning experts only …

Design and implementation of probabilistic programming language anglican

D Tolpin, JW van de Meent, H Yang… - Proceedings of the 28th …, 2016 - dl.acm.org
Anglican is a probabilistic programming system designed to interoperate with Clojure and
other JVM languages. We introduce the programming language Anglican, outline our design …

Nested variational inference

H Zimmermann, H Wu, B Esmaeili… - Advances in Neural …, 2021 - proceedings.neurips.cc
We develop nested variational inference (NVI), a family of methods that learn proposals for
nested importance samplers by minimizing a forward or reverse KL divergence at each …

Automating inference, learning, and design using probabilistic programming

T Rainforth - 2017 - ora.ox.ac.uk
Imagine a world where computational simulations can be inverted as easily as running them
forwards, where data can be used to refine models automatically, and where the only …

Interacting contour stochastic gradient Langevin dynamics

W Deng, S Liang, B Hao, G Lin, F Liang - arXiv preprint arXiv:2202.09867, 2022 - arxiv.org
We propose an interacting contour stochastic gradient Langevin dynamics (ICSGLD)
sampler, an embarrassingly parallel multiple-chain contour stochastic gradient Langevin …

Rethinking variational inference for probabilistic programs with stochastic support

T Reichelt, L Ong, T Rainforth - Advances in Neural …, 2022 - proceedings.neurips.cc
We introduce Support Decomposition Variational Inference (SDVI), a new
variational inference (VI) approach for probabilistic programs with stochastic support …