Bayesian computation: a summary of the current state, and samples backwards and forwards
Recent decades have seen enormous improvements in computational inference for
statistical models; there have been competitive continual enhancements in a wide range of …
Sparse, adaptive Smolyak quadratures for Bayesian inverse problems
C Schillings, C Schwab - Inverse Problems, 2013 - iopscience.iop.org
Based on the parametric deterministic formulation of Bayesian inverse problems with
unknown input parameter from infinite-dimensional, separable Banach spaces proposed in …
Sparse deterministic approximation of Bayesian inverse problems
We present a parametric deterministic formulation of Bayesian inverse problems with an
input parameter from infinite-dimensional, separable Banach spaces. In this formulation, the …
Spherical Hamiltonian Monte Carlo for constrained target distributions
S Lan, B Zhou, B Shahbaba - International Conference on …, 2014 - proceedings.mlr.press
Statistical models with constrained probability distributions are abundant in machine
learning. Some examples include regression models with norm constraints (e.g., Lasso) …
In search of lost mixing time: adaptive Markov chain Monte Carlo schemes for Bayesian variable selection with very large p
The availability of datasets with large numbers of variables is rapidly increasing. The
effective application of Bayesian variable selection methods for regression with these …
Sampling constrained continuous probability distributions: A review
The problem of sampling constrained continuous distributions has frequently appeared in
many machine/statistical learning models. Many Markov Chain Monte Carlo (MCMC) …
Optimal scaling of random-walk metropolis algorithms on general target distributions
One main limitation of the existing optimal scaling results for Metropolis–Hastings algorithms
is that the assumptions on the target distribution are unrealistic. In this paper, we consider …
Sparsity in Bayesian inversion of parametric operator equations
C Schillings, C Schwab - Inverse Problems, 2014 - iopscience.iop.org
We establish posterior sparsity in Bayesian inversion for systems governed by operator
equations with distributed parameter uncertainty subject to noisy observation data δ. We …
Complexity bounds for Markov chain Monte Carlo algorithms via diffusion limits
GO Roberts, JS Rosenthal - Journal of Applied Probability, 2016 - cambridge.org
We connect known results about diffusion limits of Markov chain Monte Carlo (MCMC)
algorithms to the computer science notion of algorithm complexity. Our main result states …
Optimal scaling for the transient phase of Metropolis–Hastings algorithms: the longtime behavior
B Jourdain, T Lelièvre, B Miasojedow - 2014 - projecteuclid.org
We consider the Random Walk Metropolis algorithm on R^n with Gaussian
proposals, and when the target probability measure is the n-fold product of a one …