Mixing of Hamiltonian Monte Carlo on strongly log-concave distributions: Continuous dynamics

O Mangoubi, A Smith - The Annals of Applied Probability, 2021 - projecteuclid.org
We obtain several quantitative bounds on the mixing properties of an “ideal” Hamiltonian
Monte Carlo (HMC) Markov chain for a strongly log-concave target distribution π on R^d. Our …

Global convergence of stochastic gradient Hamiltonian Monte Carlo for nonconvex stochastic optimization: Nonasymptotic performance bounds and momentum-based …

X Gao, M Gürbüzbalaban, L Zhu - Operations Research, 2022 - pubsonline.informs.org
Stochastic gradient Hamiltonian Monte Carlo (SGHMC) is a variant of stochastic gradient descent
with momentum where a controlled and properly scaled Gaussian noise is added to the …

Stacking for non-mixing Bayesian computations: The curse and blessing of multimodal posteriors

Y Yao, A Vehtari, A Gelman - Journal of Machine Learning Research, 2022 - jmlr.org
When working with multimodal Bayesian posterior distributions, Markov chain Monte Carlo
(MCMC) algorithms have difficulty moving between modes, and default variational or mode …

Emerging Directions in Bayesian Computation

S Winter, T Campbell, L Lin, S Srivastava… - Statistical …, 2024 - projecteuclid.org
Bayesian models are powerful tools for studying complex data, allowing the analyst to
encode rich hierarchical dependencies and leverage prior information. Most importantly …

Adaptive weighting of Bayesian physics informed neural networks for multitask and multiscale forward and inverse problems

S Perez, S Maddu, IF Sbalzarini, P Poncet - Journal of Computational …, 2023 - Elsevier
In this paper, we present a novel methodology for automatic adaptive weighting of Bayesian
Physics-Informed Neural Networks (BPINNs), and we demonstrate that this makes it …

The Hastings algorithm at fifty

DB Dunson, JE Johndrow - Biometrika, 2020 - academic.oup.com
In a 1970 Biometrika paper, WK Hastings developed a broad class of Markov chain
algorithms for sampling from probability distributions that are difficult to sample from directly …

Schrödinger-Föllmer sampler: sampling without ergodicity

J Huang, Y Jiao, L Kang, X Liao, J Liu… - arXiv preprint arXiv …, 2021 - researchgate.net
Sampling from probability distributions is an important problem in statistics and machine
learning, especially in Bayesian inference when integration with respect to posterior …

Cauchy Markov random field priors for Bayesian inversion

J Suuronen, NK Chada, L Roininen - Statistics and Computing, 2022 - Springer
The use of Cauchy Markov random field priors in statistical inverse problems can
potentially lead to posterior distributions which are non-Gaussian, high-dimensional …

Parallel MCMC algorithms: theoretical foundations, algorithm design, case studies

NE Glatt-Holtz, AJ Holbrook, JA Krometis… - … of Mathematics and …, 2024 - academic.oup.com
Parallel Markov Chain Monte Carlo (pMCMC) algorithms generate clouds of
proposals at each step to efficiently resolve a target probability distribution. We build a …

Adaptive tuning of Hamiltonian Monte Carlo within sequential Monte Carlo

A Buchholz, N Chopin, PE Jacob - Bayesian Analysis, 2021 - projecteuclid.org
Sequential Monte Carlo (SMC) samplers are an alternative to MCMC for Bayesian
computation. However, their performance depends strongly on the Markov kernels used to …