An invitation to sequential Monte Carlo samplers
Statisticians often use Monte Carlo methods to approximate probability
distributions, primarily with Markov chain Monte Carlo and importance sampling. Sequential …
Turing: a language for flexible probabilistic inference
Probabilistic programming promises to simplify and democratize probabilistic machine
learning, but successful probabilistic programming systems require flexible, generic and …
Inference compilation and universal probabilistic programming
We introduce a method for using deep neural networks to amortize the cost of inference in
models from the family induced by universal probabilistic programming languages …
Elements of sequential Monte Carlo
CA Naesseth, F Lindsten… - Foundations and Trends …, 2019 - nowpublishers.com
A core problem in statistics and probabilistic machine learning is to compute probability
distributions and expectations. This is the fundamental problem of Bayesian statistics and …
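As a hedged aside on the core computation this entry refers to (approximating expectations under an intractable target), the self-normalized importance sampling estimator that sequential Monte Carlo methods build upon can be sketched as follows, with notation chosen here rather than quoted from the text:
\[
  \mathbb{E}_{\pi}[f(x)] \;\approx\; \sum_{i=1}^{N} \frac{w_i}{\sum_{j=1}^{N} w_j}\, f(x_i),
  \qquad w_i = \frac{\gamma(x_i)}{q(x_i)}, \qquad x_i \sim q,
\]
where $q$ is the proposal, $\gamma$ is the unnormalized target, and $\pi = \gamma / Z$ is the normalized target.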
Probabilistic programs as an action description language
RI Brafman, D Tolpin, O Wertheim - … of the AAAI Conference on Artificial …, 2023 - ojs.aaai.org
Action description languages (ADLs), such as STRIPS, PDDL, and RDDL, specify the input
format for planning algorithms. Unfortunately, their syntax is familiar to planning experts only …
Design and implementation of probabilistic programming language anglican
Anglican is a probabilistic programming system designed to interoperate with Clojure and
other JVM languages. We introduce the programming language Anglican, outline our design …
Nested variational inference
We develop nested variational inference (NVI), a family of methods that learn proposals for
nested importance samplers by minimizing a forward or reverse KL divergence at each …
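As a hedged sketch of the per-level objectives this snippet alludes to (notation assumed here, not taken from the paper): at each level $k$ of the nested sampler, a proposal $q_k$ is fit to an intermediate target $\pi_k$ by minimizing either the reverse or the forward KL divergence,
\[
  \min_{q_k} \mathrm{KL}\!\left(q_k \,\|\, \pi_k\right)
  \qquad \text{or} \qquad
  \min_{q_k} \mathrm{KL}\!\left(\pi_k \,\|\, q_k\right).
\]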
Automating inference, learning, and design using probabilistic programming
T Rainforth - 2017 - ora.ox.ac.uk
Imagine a world where computational simulations can be inverted as easily as running them
forwards, where data can be used to refine models automatically, and where the only …
Interacting contour stochastic gradient Langevin dynamics
We propose an interacting contour stochastic gradient Langevin dynamics (ICSGLD)
sampler, an embarrassingly parallel multiple-chain contour stochastic gradient Langevin …
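For context, and as an assumption on my part rather than a claim about the ICSGLD construction itself, the basic stochastic gradient Langevin dynamics update that contour variants extend can be written as
\[
  \theta_{t+1} = \theta_t - \epsilon_t\, \widehat{\nabla U}(\theta_t) + \sqrt{2\epsilon_t}\,\xi_t,
  \qquad \xi_t \sim \mathcal{N}(0, I),
\]
where $\widehat{\nabla U}$ is a stochastic (mini-batch) estimate of the gradient of the negative log posterior and $\epsilon_t$ is the step size.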
Rethinking variational inference for probabilistic programs with stochastic support
We introduce Support Decomposition Variational Inference (SDVI), a new
variational inference (VI) approach for probabilistic programs with stochastic support …