Mixing of Hamiltonian Monte Carlo on strongly log-concave distributions: Continuous dynamics
O Mangoubi, A Smith - The Annals of Applied Probability, 2021 - projecteuclid.org
We obtain several quantitative bounds on the mixing properties of an “ideal” Hamiltonian
Monte Carlo (HMC) Markov chain for a strongly log-concave target distribution π on R d. Our …
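The entry above concerns an idealized HMC chain for a strongly log-concave target. As a point of reference, here is a minimal sketch of a discretized HMC transition (leapfrog integrator plus Metropolis accept/reject) on a standard-Gaussian toy target; the step size, trajectory length, and function names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def leapfrog(theta, p, grad_U, eps, n_steps):
    # Leapfrog integration of Hamiltonian dynamics for potential U.
    p = p - 0.5 * eps * grad_U(theta)
    for _ in range(n_steps - 1):
        theta = theta + eps * p
        p = p - eps * grad_U(theta)
    theta = theta + eps * p
    p = p - 0.5 * eps * grad_U(theta)
    return theta, p

def hmc_step(theta, U, grad_U, rng, eps=0.1, n_steps=20):
    # One HMC transition: fresh Gaussian momentum, leapfrog, accept/reject.
    p0 = rng.normal(size=theta.shape)
    theta_new, p_new = leapfrog(theta, p0, grad_U, eps, n_steps)
    h0 = U(theta) + 0.5 * p0 @ p0
    h1 = U(theta_new) + 0.5 * p_new @ p_new
    if rng.random() < np.exp(min(0.0, h0 - h1)):
        return theta_new
    return theta

# Strongly log-concave toy target: standard Gaussian, U(x) = |x|^2 / 2.
rng = np.random.default_rng(0)
theta = np.array([5.0, 5.0])
samples = []
for _ in range(2000):
    theta = hmc_step(theta, lambda x: 0.5 * x @ x, lambda x: x, rng)
    samples.append(theta)
mean = np.mean(samples[500:], axis=0)
```

The "ideal" chain analyzed in the paper corresponds to integrating the Hamiltonian dynamics exactly rather than with leapfrog steps as above.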
Global convergence of stochastic gradient Hamiltonian Monte Carlo for nonconvex stochastic optimization: Nonasymptotic performance bounds and momentum-based …
Stochastic gradient Hamiltonian Monte Carlo (SGHMC) is a variant of stochastic gradients
with momentum where a controlled and properly scaled Gaussian noise is added to the …
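The snippet above describes SGHMC as momentum SGD with a controlled, properly scaled Gaussian noise injection. A minimal sketch of one such update, with illustrative step-size and friction values (not taken from the paper) and a toy quadratic loss:

```python
import numpy as np

def sghmc_step(theta, v, stoch_grad, rng, eta=1e-3, alpha=0.1):
    """One SGHMC update: momentum SGD plus properly scaled Gaussian noise.

    eta (step size) and alpha (friction) are illustrative values.
    The injected noise variance 2*alpha*eta is the scaling that makes
    the chain sample from exp(-loss) rather than optimize it.
    """
    noise = rng.normal(scale=np.sqrt(2.0 * alpha * eta), size=theta.shape)
    v = (1.0 - alpha) * v - eta * stoch_grad(theta) + noise
    return theta + v, v

# Toy loss U(theta) = theta^2 / 2, so the (here exact) gradient is theta.
rng = np.random.default_rng(0)
theta, v = np.array([3.0]), np.zeros(1)
for _ in range(5000):
    theta, v = sghmc_step(theta, v, lambda t: t, rng)
```

In the nonconvex-optimization setting of the paper, the stochastic gradient would come from minibatches of a loss rather than the exact gradient used in this toy run.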
Stacking for non-mixing Bayesian computations: The curse and blessing of multimodal posteriors
When working with multimodal Bayesian posterior distributions, Markov chain Monte Carlo
(MCMC) algorithms have difficulty moving between modes, and default variational or mode …
Emerging Directions in Bayesian Computation
Bayesian models are powerful tools for studying complex data, allowing the analyst to
encode rich hierarchical dependencies and leverage prior information. Most importantly …
Adaptive weighting of Bayesian physics informed neural networks for multitask and multiscale forward and inverse problems
In this paper, we present a novel methodology for automatic adaptive weighting of Bayesian
Physics-Informed Neural Networks (BPINNs), and we demonstrate that this makes it …
The Hastings algorithm at fifty
DB Dunson, JE Johndrow - Biometrika, 2020 - academic.oup.com
In a 1970 Biometrika paper, WK Hastings developed a broad class of Markov chain
algorithms for sampling from probability distributions that are difficult to sample from directly …
Schrödinger-Föllmer sampler: sampling without ergodicity
Sampling from probability distributions is an important problem in statistics and machine
learning, especially in Bayesian inference when integration with respect to posterior …
Cauchy Markov random field priors for Bayesian inversion
J Suuronen, NK Chada, L Roininen - Statistics and Computing, 2022 - Springer
Abstract The use of Cauchy Markov random field priors in statistical inverse problems can
potentially lead to posterior distributions which are non-Gaussian, high-dimensional …
Parallel MCMC algorithms: theoretical foundations, algorithm design, case studies
Abstract Parallel Markov Chain Monte Carlo (pMCMC) algorithms generate clouds of
proposals at each step to efficiently resolve a target probability distribution. We build a …
Adaptive tuning of Hamiltonian Monte Carlo within sequential Monte Carlo
Abstract Sequential Monte Carlo (SMC) samplers are an alternative to MCMC for Bayesian
computation. However, their performance depends strongly on the Markov kernels used to …