Random features for kernel approximation: A survey on algorithms, theory, and beyond
The class of random features is one of the most popular techniques to speed up kernel
methods in large-scale problems. Related works have been recognized by the NeurIPS Test …
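To make the surveyed idea concrete, here is a minimal sketch of random Fourier features for the Gaussian (RBF) kernel in the style of Rahimi and Recht; the feature count, lengthscale, and data below are illustrative choices, not taken from the survey.

```python
import numpy as np

def random_fourier_features(X, D=500, lengthscale=1.0, rng=None):
    """Map X (n, d) to D random Fourier features approximating the RBF kernel
    k(x, y) = exp(-||x - y||^2 / (2 * lengthscale^2))."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    W = rng.normal(scale=1.0 / lengthscale, size=(d, D))  # samples from the kernel's spectral density
    b = rng.uniform(0.0, 2 * np.pi, size=D)               # random phases
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Z @ Z.T approximates the exact Gram matrix; the error shrinks as D grows.
X = np.random.default_rng(0).normal(size=(200, 5))
Z = random_fourier_features(X, D=2000, lengthscale=1.5, rng=1)
K_exact = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1) / (2 * 1.5**2))
print(np.abs(Z @ Z.T - K_exact).max())
```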
Postprocessing of MCMC
Markov chain Monte Carlo is the engine of modern Bayesian statistics, being used to
approximate the posterior and derived quantities of interest. Despite this, the issue of how …
Random feature Stein discrepancies
Computable Stein discrepancies have been deployed for a variety of applications, ranging
from sampler selection in posterior inference to approximate Bayesian inference to …
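The "computable Stein discrepancies" referred to here are typically kernel Stein discrepancies. The random-feature variants developed in this paper are more involved, but a minimal sketch of a plain kernel Stein discrepancy for a one-dimensional standard normal target (RBF kernel, V-statistic form, illustrative lengthscale) shows the quantity being approximated:

```python
import numpy as np

def ksd_rbf_1d(x, score, lengthscale=1.0):
    """V-statistic kernel Stein discrepancy of 1-d samples x against a target
    with score function s(x) = d/dx log p(x), using an RBF kernel."""
    x = np.asarray(x, dtype=float)
    s = score(x)
    d = x[:, None] - x[None, :]
    l2 = lengthscale ** 2
    k = np.exp(-d**2 / (2 * l2))
    dkdx = -d / l2 * k                      # derivative of k in its first argument
    dkdy = d / l2 * k                       # derivative of k in its second argument
    d2k = (1.0 / l2 - d**2 / l2**2) * k     # mixed second derivative
    kp = d2k + s[:, None] * dkdy + s[None, :] * dkdx + s[:, None] * s[None, :] * k
    return np.sqrt(kp.mean())

rng = np.random.default_rng(0)
good = rng.normal(size=500)          # samples from the N(0, 1) target
bad = rng.normal(loc=1.0, size=500)  # samples from a shifted distribution
score = lambda x: -x                 # score of the standard normal
print(ksd_rbf_1d(good, score), ksd_rbf_1d(bad, score))  # the latter is larger
```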
On the positivity and magnitudes of Bayesian quadrature weights
This article reviews and studies the properties of Bayesian quadrature weights, which
strongly affect stability and robustness of the quadrature rule. Specifically, we investigate …
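For intuition, a minimal sketch of how Bayesian quadrature weights arise, under illustrative assumptions not tied to this article (Gaussian kernel, uniform measure on [0, 1], evenly spaced nodes): the weights solve a kernel linear system and need not be positive or sum to one.

```python
import numpy as np
from scipy.special import erf
from scipy.linalg import solve

def bq_weights(nodes, lengthscale=0.2):
    """Bayesian quadrature weights w = K^{-1} z for a Gaussian kernel prior and
    the uniform measure on [0, 1]; z is the kernel mean evaluated at the nodes."""
    x = np.asarray(nodes, dtype=float)
    l = lengthscale
    K = np.exp(-(x[:, None] - x[None, :])**2 / (2 * l**2))
    # Closed-form kernel mean: z_i = \int_0^1 k(t, x_i) dt.
    z = l * np.sqrt(np.pi / 2) * (erf((1 - x) / (np.sqrt(2) * l)) + erf(x / (np.sqrt(2) * l)))
    return solve(K + 1e-8 * np.eye(len(x)), z, assume_a='pos')  # small jitter for conditioning

nodes = np.linspace(0, 1, 15)
w = bq_weights(nodes)
print(w.sum(), (w < 0).any())        # weights need not sum to 1, and some may be negative
# The BQ estimate of \int_0^1 f is simply w @ f(nodes).
print(w @ np.sin(np.pi * nodes), 2 / np.pi)  # compare to the exact integral
```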
Convergence rates for a class of estimators based on Stein's method
Bernoulli 25(2), 2019, 1141–1159. https://doi.org/10.3150/17-BEJ1016
Bayesian quadrature for multiple related integrals
Bayesian probabilistic numerical methods are a set of tools providing posterior distributions
on the output of numerical methods. The use of these methods is usually motivated by the …
Stein Π-Importance Sampling
Stein discrepancies have emerged as a powerful tool for retrospective improvement of
Markov chain Monte Carlo output. However, the question of how to design Markov chains …
Convergence analysis of deterministic kernel-based quadrature rules in misspecified settings
M Kanagawa, BK Sriperumbudur… - Foundations of …, 2020 - Springer
This paper presents convergence analysis of kernel-based quadrature rules in misspecified
settings, focusing on deterministic quadrature in Sobolev spaces. In particular, we deal with …
Vector-valued control variates
Control variates are variance reduction tools for Monte Carlo estimators. They can provide
significant variance reduction, but usually require a large number of samples, which can be …
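As background, a minimal sketch of the scalar control-variate estimator that this paper extends to the vector-valued setting; the integrand, control function, and sample size are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)        # samples from N(0, 1)

f = np.exp(x)                       # integrand; exact mean is exp(1/2)
g = x                               # control variate with known mean E[g] = 0

# Coefficient minimising the variance of f - beta * (g - E[g]).
beta = np.cov(f, g)[0, 1] / np.var(g, ddof=1)

plain = f.mean()
cv = (f - beta * (g - 0.0)).mean()

exact = np.exp(0.5)
print(abs(plain - exact), abs(cv - exact))  # the control-variate error is usually smaller
```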
Geometrically coupled monte carlo sampling
M Rowland, KM Choromanski… - Advances in …, 2018 - proceedings.neurips.cc
Monte Carlo sampling in high-dimensional, low-sample settings is important in many
machine learning tasks. We improve current methods for sampling in Euclidean spaces by …
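One simple coupling studied in this line of work is orthogonal sampling of Gaussian vectors: directions are orthogonalised while each sample keeps its Gaussian marginal. The sketch below is illustrative only and is not the specific geometric coupling proposed in the paper:

```python
import numpy as np

def orthogonal_gaussian(d, rng=None):
    """Draw d coupled samples that are each marginally N(0, I_d) but have mutually
    orthogonal directions: Haar-orthogonal rows rescaled by chi(d) norms."""
    rng = np.random.default_rng(rng)
    Q, R = np.linalg.qr(rng.normal(size=(d, d)))
    Q = Q * np.sign(np.diag(R))                   # sign fix so Q is Haar-distributed
    norms = np.sqrt(rng.chisquare(df=d, size=d))  # norms of d-dimensional Gaussians
    return Q * norms[:, None]

S = orthogonal_gaussian(6, rng=0)
print(np.round(S @ S.T, 6))  # off-diagonal zeros: the samples are pairwise orthogonal
```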