A Langevin-like sampler for discrete distributions
We propose the discrete Langevin proposal (DLP), a simple and scalable gradient-based
proposal for sampling complex high-dimensional discrete distributions. In contrast to Gibbs …
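To make the mechanics concrete, the sketch below is a minimal NumPy illustration of a discrete Langevin-style proposal on a binary space, assuming a target pi(x) proportional to exp(f(x)) whose energy f can be differentiated after relaxing x to [0,1]^d. The toy Ising-like target, the step size alpha, and the Metropolis-Hastings correction are illustrative choices, not the authors' reference implementation.

import numpy as np
rng = np.random.default_rng(0)

def dlp_flip_probs(x, grad, alpha):
    # Per-coordinate probability of proposing value 1, from the categorical
    # q_i(v|x) proportional to exp(0.5*grad_i*(v - x_i) - (v - x_i)**2 / (2*alpha)), v in {0, 1}.
    d0, d1 = -x, 1.0 - x
    l0 = 0.5 * grad * d0 - d0**2 / (2 * alpha)
    l1 = 0.5 * grad * d1 - d1**2 / (2 * alpha)
    return 1.0 / (1.0 + np.exp(l0 - l1))

def log_q(y, p1):
    # log probability of proposing y, given per-coordinate P(value = 1) = p1
    return np.sum(np.where(y == 1.0, np.log(p1), np.log1p(-p1)))

# Toy Ising-like target pi(x) proportional to exp(f(x)) on {0,1}^d
d = 10
W = rng.normal(scale=0.1, size=(d, d)); b = rng.normal(size=d)
f = lambda x: x @ W @ x + b @ x
grad_f = lambda x: (W + W.T) @ x + b

x, alpha = rng.integers(0, 2, d).astype(float), 0.2
for _ in range(2000):
    p_fwd = dlp_flip_probs(x, grad_f(x), alpha)
    y = (rng.random(d) < p_fwd).astype(float)          # propose all coordinates in parallel
    p_rev = dlp_flip_probs(y, grad_f(y), alpha)
    log_acc = f(y) - f(x) + log_q(x, p_rev) - log_q(y, p_fwd)
    if np.log(rng.random()) < log_acc:                  # Metropolis-Hastings correction
        x = y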
Optimal scaling for locally balanced proposals in discrete spaces
Optimal scaling has been well studied for Metropolis-Hastings (MH) algorithms in
continuous spaces, but a similar understanding has been lacking in discrete spaces …
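In practice, scaling results of this kind are used by tuning the proposal step size so that the empirical acceptance rate matches a target value. The sketch below shows a generic Robbins-Monro style adaptation loop; the decay exponent and the placeholder target rate of 0.574 (the classical continuous-space MALA value) are assumptions for illustration, not results taken from this paper.

import numpy as np

def adapt_step_size(alpha, accept_prob, iteration, target=0.574):
    # Hypothetical Robbins-Monro update: grow alpha when accepting too often,
    # shrink it when accepting too rarely, with a decaying adaptation rate.
    gamma = (iteration + 1) ** -0.6
    return float(np.exp(np.log(alpha) + gamma * (accept_prob - target)))

# Example: feed in the acceptance probability of each MH step during burn-in.
alpha = 0.5
for it, acc in enumerate([0.9, 0.8, 0.3, 0.6, 0.55]):   # stand-in acceptance probabilities
    alpha = adapt_step_size(alpha, acc, it)
print(alpha)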
The Barker proposal: combining robustness and efficiency in gradient-based MCMC
S Livingstone, G Zanella - … of the Royal Statistical Society Series …, 2022 - academic.oup.com
There is a tension between robustness and efficiency when designing Markov chain Monte
Carlo (MCMC) sampling algorithms. Here we focus on robustness with respect to tuning …
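As a concrete reference point, the following is a minimal sketch of the coordinate-wise Barker proposal with Gaussian increments and its Metropolis-Hastings correction; the Gaussian toy target and the fixed scale sigma are illustrative choices.

import numpy as np
rng = np.random.default_rng(1)

def barker_logq(x, y, grad_x, sigma):
    # log of the coordinate-wise Barker proposal density
    # q(y|x) = prod_i 2 * N(y_i - x_i; 0, sigma^2) / (1 + exp(-(y_i - x_i) * grad_x_i))
    z = y - x
    log_normal = -0.5 * (z / sigma)**2 - np.log(sigma * np.sqrt(2 * np.pi))
    log_beta = -np.logaddexp(0.0, -z * grad_x)
    return np.sum(np.log(2.0) + log_normal + log_beta)

def barker_step(x, log_pi, grad_log_pi, sigma):
    g = grad_log_pi(x)
    z = rng.normal(scale=sigma, size=x.shape)
    keep = rng.random(x.shape) < 1.0 / (1.0 + np.exp(-z * g))   # keep sign of z with prob beta
    y = x + np.where(keep, z, -z)
    log_acc = (log_pi(y) - log_pi(x)
               + barker_logq(y, x, grad_log_pi(y), sigma)       # log q(x|y)
               - barker_logq(x, y, g, sigma))                   # log q(y|x)
    return y if np.log(rng.random()) < log_acc else x

# Toy usage on a standard Gaussian target
log_pi = lambda x: -0.5 * np.sum(x**2)
grad_log_pi = lambda x: -x
x = np.zeros(5)
for _ in range(1000):
    x = barker_step(x, log_pi, grad_log_pi, sigma=1.0)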
Analysis of stochastic gradient descent in continuous time
J Latz - Statistics and Computing, 2021 - Springer
Stochastic gradient descent is an optimisation method that combines classical gradient
descent with random subsampling within the target functional. In this work, we introduce the …
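The continuous-time object studied here can be simulated directly: a data index that switches at exponential waiting times, with the iterate following the gradient flow of the currently selected summand between switches. The sketch below does this with an Euler discretisation; the quadratic summands, waiting-time mean and step size are illustrative assumptions, not the paper's setup.

import numpy as np
rng = np.random.default_rng(2)

N, dim = 20, 2
centers = rng.normal(size=(N, dim))                    # f_i(x) = 0.5 * ||x - c_i||^2
grad_i = lambda x, i: x - centers[i]                   # gradient of the i-th summand

def stochastic_gradient_process(x0, T=50.0, mean_wait=0.1, dt=1e-3):
    x, t, i = x0.copy(), 0.0, rng.integers(N)
    path = [(t, x.copy())]
    while t < T:
        wait = rng.exponential(mean_wait)              # time until the next index switch
        for _ in range(max(1, int(wait / dt))):        # Euler steps along the flow of f_i
            x = x - dt * grad_i(x, i)
        t += wait
        i = rng.integers(N)                            # resample the active data index
        path.append((t, x.copy()))
    return path

path = stochastic_gradient_process(np.zeros(dim))
print(path[-1][1], centers.mean(axis=0))               # iterate fluctuates around the full-data minimiser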
A kernel stein test of goodness of fit for sequential models
J Baum, H Kanagawa, A Gretton - … Conference on Machine …, 2023 - proceedings.mlr.press
We propose a goodness-of-fit measure for probability densities modeling observations with
varying dimensionality, such as text documents of differing lengths or variable-length …
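For orientation, the snippet below computes the standard fixed-dimension kernel Stein discrepancy U-statistic with the Langevin Stein operator and an RBF kernel; the sequential, variable-length construction that is the paper's contribution is not reproduced here, and the bandwidth and toy Gaussian check are illustrative.

import numpy as np

def ksd_ustat(X, score, h=1.0):
    n, d = X.shape
    S = score(X)                                       # (n, d) score s(x) = grad log p(x)
    diffs = X[:, None, :] - X[None, :, :]              # (n, n, d) pairwise differences
    sqd = np.sum(diffs**2, axis=-1)
    K = np.exp(-sqd / (2 * h**2))                      # RBF kernel matrix
    grad_x_K = -diffs / h**2 * K[..., None]            # grad_x k(x_i, x_j)
    grad_y_K = -grad_x_K                               # grad_y k(x_i, x_j)
    trace_term = K * (d / h**2 - sqd / h**4)           # sum_i d^2 k / dx_i dy_i
    U = (K * (S @ S.T)
         + np.einsum('id,ijd->ij', S, grad_y_K)
         + np.einsum('jd,ijd->ij', S, grad_x_K)
         + trace_term)
    np.fill_diagonal(U, 0.0)                           # U-statistic: drop diagonal terms
    return U.sum() / (n * (n - 1))

# Samples from N(0, I) tested against the N(0, I) score: KSD should be near zero.
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2))
print(ksd_ustat(X, score=lambda X: -X))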
The reproducing Stein kernel approach for post-hoc corrected sampling
Stein importance sampling is a widely applicable technique based on kernelized Stein
discrepancy, which corrects the output of approximate sampling algorithms by reweighting …
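A minimal sketch of the post-hoc correction idea, assuming an RBF base kernel and the Langevin Stein operator: build the Stein kernel matrix from the samples and the target score, then choose simplex weights that minimise the resulting kernelised Stein discrepancy. The exponentiated-gradient solver, step sizes and toy example are illustrative choices rather than the paper's algorithm.

import numpy as np

def stein_kernel(X, score, h=1.0):
    # Kp[i, j] = u_p(x_i, x_j) for an RBF base kernel and the Langevin Stein operator
    n, d = X.shape
    S = score(X)
    diffs = X[:, None, :] - X[None, :, :]
    sqd = np.sum(diffs**2, axis=-1)
    K = np.exp(-sqd / (2 * h**2))
    gxK = -diffs / h**2 * K[..., None]                 # grad_x k(x_i, x_j)
    return (K * (S @ S.T)
            + np.einsum('id,ijd->ij', S, -gxK)         # s(x_i) . grad_y k
            + np.einsum('jd,ijd->ij', S, gxK)          # s(x_j) . grad_x k
            + K * (d / h**2 - sqd / h**4))             # sum_i d^2 k / dx_i dy_i

def stein_weights(Kp, steps=500, lr=0.1):
    # Exponentiated-gradient minimisation of w.T @ Kp @ w over the probability simplex
    n = Kp.shape[0]
    w = np.full(n, 1.0 / n)
    for _ in range(steps):
        w = w * np.exp(-lr * (Kp @ w))                 # multiplicative update
        w /= w.sum()                                   # project back onto the simplex
    return w

# Correct biased samples (drawn from N(1, I)) toward the target N(0, I).
rng = np.random.default_rng(4)
X = rng.normal(loc=1.0, size=(200, 2))
w = stein_weights(stein_kernel(X, score=lambda X: -X))
print((w[:, None] * X).sum(axis=0))                    # weighted mean is pulled toward the target mean 0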
Robust Approximate Sampling via Stochastic Gradient Barker Dynamics
Stochastic Gradient (SG) Markov chain Monte Carlo (MCMC) algorithms are popular
for Bayesian sampling in the presence of large datasets. However, they …
Improving multiple-try Metropolis with local balancing
Multiple-try Metropolis (MTM) is a popular Markov chain Monte Carlo method with the
appealing feature of being amenable to parallel computing. At each iteration, it samples …
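A minimal sketch of multiple-try Metropolis with a locally balanced weight function w(y, x) = sqrt(pi(y)/pi(x)), using a symmetric Gaussian try distribution; the number of tries, the particular balancing function and the Gaussian toy target are illustrative choices.

import numpy as np
rng = np.random.default_rng(5)

log_pi = lambda x: -0.5 * np.sum(x**2, axis=-1)        # standard Gaussian target

def log_weight(y, x):
    return 0.5 * (log_pi(y) - log_pi(x))               # log sqrt(pi(y)/pi(x))

def mtm_step(x, n_tries=5, sigma=1.0):
    # 1) draw candidates around x and 2) pick one with probability proportional to w(y_j, x)
    ys = x + sigma * rng.normal(size=(n_tries, x.size))
    lw_fwd = log_weight(ys, x)
    probs = np.exp(lw_fwd - np.logaddexp.reduce(lw_fwd))
    y = ys[rng.choice(n_tries, p=probs)]
    # 3) reference set: candidates around y, with the current state x forced in as the last one
    xs = y + sigma * rng.normal(size=(n_tries - 1, x.size))
    lw_rev = np.append(log_weight(xs, y), log_weight(x, y))
    # 4) accept with min(1, sum_j w(y_j, x) / sum_j w(x*_j, y))
    log_acc = np.logaddexp.reduce(lw_fwd) - np.logaddexp.reduce(lw_rev)
    return y if np.log(rng.random()) < log_acc else x

x = np.full(3, 5.0)
samples = []
for _ in range(5000):
    x = mtm_step(x)
    samples.append(x.copy())
print(np.mean(samples, axis=0))                        # near zero for the Gaussian target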
Optimal design of the Barker proposal and other locally balanced Metropolis–Hastings algorithms
We study the class of first-order locally balanced Metropolis–Hastings algorithms introduced
in Livingstone & Zanella (2022). To choose a specific algorithm within the class, the user …
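The choice in question is the balancing function g, which must satisfy g(t) = t g(1/t). The snippet below lists a few standard candidates and checks that identity numerically; it does not reproduce the paper's optimality analysis.

import numpy as np

balancing = {
    "sqrt":   lambda t: np.sqrt(t),                    # g(t) = t^(1/2)
    "barker": lambda t: t / (1.0 + t),                 # Barker / sigmoid-type choice
    "min":    lambda t: np.minimum(1.0, t),            # Metropolis-type choice
    "max":    lambda t: np.maximum(1.0, t),
}

t = np.exp(np.linspace(-3, 3, 7))
for name, g in balancing.items():
    assert np.allclose(g(t), t * g(1.0 / t)), name     # the locally balanced identity
print("all satisfy g(t) = t * g(1/t)")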
Adaptive random neighbourhood informed Markov chain Monte Carlo for high-dimensional Bayesian variable selection
We introduce a framework for efficient Markov chain Monte Carlo algorithms targeting
discrete-valued high-dimensional distributions, such as posterior distributions in Bayesian …
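To make the idea concrete, the sketch below implements a simple informed (locally balanced) single-flip proposal over a binary inclusion vector, with each flip weighted by the square root of the posterior ratio and corrected by Metropolis-Hastings; the random-neighbourhood and adaptive components of the paper are not reproduced, and log_post is a toy stand-in for a real variable-selection posterior.

import numpy as np
rng = np.random.default_rng(6)

p = 15
signal = np.zeros(p); signal[:3] = 1.0                 # "true" inclusions for the toy posterior
log_post = lambda g: -4.0 * np.sum((g - signal)**2)    # toy log posterior over {0,1}^p

def flip_logits(gamma):
    # log weight of flipping each coordinate: 0.5 * (log pi(flip_i(gamma)) - log pi(gamma))
    lp0 = log_post(gamma)
    return np.array([0.5 * (log_post(np.abs(gamma - np.eye(p)[i])) - lp0) for i in range(p)])

def informed_flip_step(gamma):
    lw = flip_logits(gamma)
    probs = np.exp(lw - np.logaddexp.reduce(lw))
    i = rng.choice(p, p=probs)                         # pick a flip, informed by the posterior
    new = gamma.copy(); new[i] = 1 - new[i]
    lw_rev = flip_logits(new)
    log_q_fwd = lw[i] - np.logaddexp.reduce(lw)        # prob of proposing this flip at gamma
    log_q_rev = lw_rev[i] - np.logaddexp.reduce(lw_rev)  # prob of the reverse flip at new
    log_acc = log_post(new) - log_post(gamma) + log_q_rev - log_q_fwd
    return new if np.log(rng.random()) < log_acc else gamma

gamma = rng.integers(0, 2, p).astype(float)
for _ in range(2000):
    gamma = informed_flip_step(gamma)
print(gamma)                                           # concentrates on the first three variables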