Random features for kernel approximation: A survey on algorithms, theory, and beyond

F Liu, X Huang, Y Chen… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
The class of random features is one of the most popular techniques to speed up kernel
methods in large-scale problems. Related works have been recognized by the NeurIPS Test …
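
The snippet cuts off before the construction itself. As background on the general technique surveyed here (a minimal sketch in the spirit of Rahimi and Recht, not this survey's specific contributions; the function name, feature dimension D, and bandwidth gamma are illustrative):

```python
# Minimal sketch of random Fourier features for a Gaussian (RBF) kernel.
# Names and parameters are illustrative, not taken from the survey.
import numpy as np

def random_fourier_features(X, D=500, gamma=1.0, rng=None):
    """Map X (n, d) to D features so that z(x) @ z(y) ~ exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # Frequencies drawn from the spectral density of the RBF kernel.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

X = np.random.default_rng(0).normal(size=(100, 5))
Z = random_fourier_features(X, D=2000, gamma=0.5, rng=1)
approx = Z @ Z.T                                               # approximate kernel matrix
exact = np.exp(-0.5 * ((X[:, None] - X[None, :]) ** 2).sum(-1))
print(np.abs(approx - exact).max())                            # shrinks as D grows
```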

Postprocessing of MCMC

LF South, M Riabiz, O Teymur… - Annual Review of …, 2022 - annualreviews.org
Markov chain Monte Carlo is the engine of modern Bayesian statistics, being used to
approximate the posterior and derived quantities of interest. Despite this, the issue of how …

Random feature Stein discrepancies

J Huggins, L Mackey - Advances in neural information …, 2018 - proceedings.neurips.cc
Computable Stein discrepancies have been deployed for a variety of applications, ranging
from sampler selection in posterior inference to approximate Bayesian inference to …
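
For context on what a computable Stein discrepancy is (the abstract is truncated before the definition), the standard kernel Stein discrepancy built from the Langevin Stein operator and a base kernel k takes the form below; this is the generic construction from the literature, not necessarily the random-feature discrepancy proposed in this paper:

```latex
% Kernel Stein discrepancy between a sample {x_i} and a target density p.
\mathrm{KSD}^2(\{x_i\}_{i=1}^n, p)
  = \frac{1}{n^2} \sum_{i=1}^{n} \sum_{j=1}^{n} k_p(x_i, x_j), \qquad
k_p(x, y) = \nabla_x \cdot \nabla_y k(x, y)
  + \nabla_x k(x, y) \cdot \nabla_y \log p(y)
  + \nabla_y k(x, y) \cdot \nabla_x \log p(x)
  + k(x, y)\, \nabla_x \log p(x) \cdot \nabla_y \log p(y).
```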

On the positivity and magnitudes of Bayesian quadrature weights

T Karvonen, M Kanagawa, S Särkkä - Statistics and Computing, 2019 - Springer
This article reviews and studies the properties of Bayesian quadrature weights, which
strongly affect stability and robustness of the quadrature rule. Specifically, we investigate …
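
As background for what these weights are (the snippet stops short of the definition), the textbook Bayesian quadrature rule for an integral of f against a measure pi, under a Gaussian-process prior with kernel k, assigns weights by solving a kernel linear system; this is the standard construction, not a statement of the paper's specific results:

```latex
% Standard Bayesian quadrature: posterior-mean estimate of \int f \,\mathrm{d}\pi
% from evaluations f(x_1), \dots, f(x_n) under a GP prior with kernel k.
\widehat{I}_{\mathrm{BQ}} = \sum_{i=1}^{n} w_i \, f(x_i), \qquad
w = K^{-1} z, \quad K_{ij} = k(x_i, x_j), \quad
z_i = \int k(x, x_i) \,\mathrm{d}\pi(x).
```

Because w is determined by K^{-1} z rather than constrained to be nonnegative, individual weights can be negative, which is the stability question the paper investigates.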

Convergence rates for a class of estimators based on Stein's method

CJ Oates, J Cockayne, FX Briol, M Girolami - 2019 - projecteuclid.org
Bernoulli 25(2), 2019, 1141–1159. https://doi.org/10.3150/17-BEJ1016

Bayesian quadrature for multiple related integrals

X Xi, FX Briol, M Girolami - International Conference on …, 2018 - proceedings.mlr.press
Bayesian probabilistic numerical methods are a set of tools providing posterior distributions
on the output of numerical methods. The use of these methods is usually motivated by the …

Stein Π-Importance Sampling

C Wang, Y Chen, H Kanagawa… - Advances in Neural …, 2024 - proceedings.neurips.cc
Stein discrepancies have emerged as a powerful tool for retrospective improvement of
Markov chain Monte Carlo output. However, the question of how to design Markov chains …

Convergence analysis of deterministic kernel-based quadrature rules in misspecified settings

M Kanagawa, BK Sriperumbudur… - Foundations of …, 2020 - Springer
This paper presents convergence analysis of kernel-based quadrature rules in misspecified
settings, focusing on deterministic quadrature in Sobolev spaces. In particular, we deal with …

Vector-valued control variates

Z Sun, A Barp, FX Briol - International Conference on …, 2023 - proceedings.mlr.press
Control variates are variance reduction tools for Monte Carlo estimators. They can provide
significant variance reduction, but usually require a large number of samples, which can be …
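
For readers unfamiliar with the underlying technique, a minimal scalar control variate estimator (the classic construction, not the vector-valued method of this paper) is sketched below; the integrand f and control g are illustrative only:

```python
# Classic (scalar) control variate: reduce the variance of a Monte Carlo estimate
# of E[f(X)] using a function g whose mean E[g(X)] is known exactly.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)           # X ~ N(0, 1)

f = np.exp(x)                          # target: E[exp(X)] = exp(1/2)
g = x                                  # control: E[X] = 0 known in closed form

beta = np.cov(f, g)[0, 1] / np.var(g)  # (estimated) optimal coefficient
plain = f.mean()                       # plain Monte Carlo estimate
cv = (f - beta * (g - 0.0)).mean()     # control variate estimate

print(plain, cv, np.exp(0.5))          # cv is typically closer to the truth
```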

Geometrically coupled monte carlo sampling

M Rowland, KM Choromanski… - Advances in …, 2018 - proceedings.neurips.cc
Monte Carlo sampling in high-dimensional, low-sample settings is important in many
machine learning tasks. We improve current methods for sampling in Euclidean spaces by …