Kernel mean embedding of distributions: A review and beyond

K Muandet, K Fukumizu… - … and Trends® in …, 2017 - nowpublishers.com
A Hilbert space embedding of a distribution—in short, a kernel mean embedding—has
recently emerged as a powerful tool for machine learning and statistical inference. The basic …
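
As a quick illustration of the basic construction (not drawn from the review itself), the sketch below forms empirical kernel mean embeddings of two samples with an RBF kernel and compares them via the squared maximum mean discrepancy; the bandwidth and the toy data are placeholders.

```python
import numpy as np

def rbf_gram(X, Y, sigma=1.0):
    """Gram matrix k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def mmd2(X, Y, sigma=1.0):
    """Biased estimate of ||mu_P - mu_Q||^2 in the RKHS, i.e. the squared
    maximum mean discrepancy between the embeddings of X ~ P and Y ~ Q."""
    return (rbf_gram(X, X, sigma).mean()
            - 2.0 * rbf_gram(X, Y, sigma).mean()
            + rbf_gram(Y, Y, sigma).mean())

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 2))   # sample from P
Y = rng.normal(0.5, 1.0, size=(500, 2))   # sample from Q (shifted mean)
print(mmd2(X, Y))                         # larger when P and Q differ
```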

Adaptive Monte Carlo augmented with normalizing flows

M Gabrié, GM Rotskoff… - Proceedings of the …, 2022 - National Acad Sciences
Many problems in the physical sciences, machine learning, and statistical inference
necessitate sampling from a high-dimensional, multimodal probability distribution. Markov …
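
The key ingredient can be illustrated with a minimal independence-Metropolis move: nonlocal proposals are drawn from an auxiliary density that, in the paper, is a normalizing flow trained adaptively on the chain. In the sketch below a fixed broad Gaussian merely stands in for that flow, and the bimodal toy target is a placeholder.

```python
import numpy as np
from scipy import stats

def log_target(x):
    """Toy bimodal target: mixture of two well-separated Gaussians."""
    return np.logaddexp(stats.norm.logpdf(x, -3, 0.5), stats.norm.logpdf(x, 3, 0.5))

# Stand-in for a trained normalizing flow: any density we can both sample
# from and evaluate pointwise. A broad Gaussian plays that role here.
proposal = stats.norm(0.0, 4.0)

def independence_metropolis(x, rng):
    """One nonlocal move: propose from the surrogate density, accept with the
    independence-Metropolis ratio pi(x') q(x) / (pi(x) q(x'))."""
    x_new = rng.normal(0.0, 4.0)
    log_alpha = (log_target(x_new) + proposal.logpdf(x)
                 - log_target(x) - proposal.logpdf(x_new))
    return x_new if np.log(rng.uniform()) < log_alpha else x

rng = np.random.default_rng(0)
x, chain = 0.0, []
for _ in range(5000):
    x = independence_metropolis(x, rng)
    chain.append(x)
print(np.mean(np.array(chain) > 0))  # roughly 0.5 if both modes are visited
```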

Active learning for deep Gaussian process surrogates

A Sauer, RB Gramacy, D Higdon - Technometrics, 2023 - Taylor & Francis
Deep Gaussian processes (DGPs) are increasingly popular as predictive models in
machine learning for their nonstationary flexibility and ability to cope with abrupt regime …
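
A stripped-down version of the sequential design loop, using a single-layer GP from scikit-learn and a simple variance-maximization acquisition as a stand-in for the paper's DGP surrogates and ALC-style criteria, might look as follows; the toy simulator and settings are placeholders.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):
    """Expensive simulator being emulated (toy stand-in)."""
    return np.sin(4.0 * x).ravel()

rng = np.random.default_rng(0)
X_train = rng.uniform(0, 1, size=(4, 1))          # small initial design
y_train = f(X_train)
X_cand = np.linspace(0, 1, 200).reshape(-1, 1)    # candidate grid

for _ in range(10):                                # sequential design loop
    gp = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(X_train, y_train)
    _, std = gp.predict(X_cand, return_std=True)
    x_next = X_cand[[np.argmax(std)]]              # most uncertain candidate
    X_train = np.vstack([X_train, x_next])
    y_train = np.append(y_train, f(x_next))

print(len(X_train), "design points selected")
```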

Generalizing Hamiltonian Monte Carlo with neural networks

D Levy, MD Hoffman, J Sohl-Dickstein - arXiv preprint arXiv:1711.09268, 2017 - arxiv.org
We present a general-purpose method to train Markov chain Monte Carlo kernels,
parameterized by deep neural networks, that converge and mix quickly to their target …
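
For reference, the vanilla HMC transition that this work generalizes (leapfrog integration of Hamiltonian dynamics followed by a Metropolis correction) can be sketched as follows; the step size, trajectory length, and standard Gaussian toy target are placeholders, and nothing here involves the paper's learned, neural-network-parameterized updates.

```python
import numpy as np

def log_prob(x):
    return -0.5 * np.sum(x ** 2)          # standard Gaussian target (toy)

def grad_log_prob(x):
    return -x

def hmc_step(x, rng, step_size=0.1, n_leapfrog=20):
    """One plain HMC transition: sample momentum, leapfrog, Metropolis accept."""
    p = rng.normal(size=x.shape)
    x_new, p_new = x.copy(), p.copy()
    p_new += 0.5 * step_size * grad_log_prob(x_new)           # initial half step
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p_new
        p_new += step_size * grad_log_prob(x_new)
    x_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_prob(x_new)           # final half step
    # Metropolis correction on the joint (position, momentum) energy
    log_accept = (log_prob(x_new) - 0.5 * np.sum(p_new ** 2)
                  - log_prob(x) + 0.5 * np.sum(p ** 2))
    return x_new if np.log(rng.uniform()) < log_accept else x

rng = np.random.default_rng(0)
x, samples = np.zeros(5), []
for _ in range(2000):
    x = hmc_step(x, rng)
    samples.append(x)
samples = np.array(samples)
print(samples[:, 0].mean(), samples[:, 0].var())   # should approach 0 and 1
```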

Eigendecompositions of transfer operators in reproducing kernel Hilbert spaces

S Klus, I Schuster, K Muandet - Journal of Nonlinear Science, 2020 - Springer
Transfer operators such as the Perron–Frobenius or Koopman operator play an important
role in the global analysis of complex dynamical systems. The eigenfunctions of these …
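
For orientation, the empirical version of these eigenproblems can be sketched in a few lines of numpy: with Gram matrices over snapshot pairs (x_i, y_i), one regularized estimator takes the eigenpairs of (G_XX + εI)^{-1} G_XY. The kernel, bandwidth, regularization, and the toy linear dynamics below are placeholders, and the orientation and scaling conventions for the Perron–Frobenius versus Koopman variants differ across papers; this is a rough sketch, not the paper's exact estimator.

```python
import numpy as np
from scipy.linalg import eig

def rbf_gram(X, Y, sigma=0.5):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Toy dynamical system: pairs (x_i, y_i) with y_i one step ahead of x_i.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(300, 1))
Y = 0.9 * X + 0.05 * rng.normal(size=X.shape)      # linear contraction + noise

G_xx = rbf_gram(X, X)
G_xy = rbf_gram(X, Y)
eps = 1e-3                                         # Tikhonov regularization

# Regularized empirical eigenproblem; eigenvalues approximate the leading
# spectrum of the embedded transfer operator on the sampled data.
A = np.linalg.solve(G_xx + eps * len(X) * np.eye(len(X)), G_xy)
eigvals, _ = eig(A)
order = np.argsort(-np.abs(eigvals))
print(np.round(eigvals[order][:5].real, 3))        # dominant eigenvalue is typically near 1
```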

[Book] Bayesian modeling and computation in Python

OA Martin, R Kumar, J Lao - 2021 - taylorfrancis.com
Bayesian Modeling and Computation in Python aims to help beginner Bayesian
practitioners become intermediate modelers. It uses a hands-on approach with PyMC3 …
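
A minimal example of the PyMC3 workflow the book teaches (specify a model block, sample, summarize), assuming PyMC3 and ArviZ are installed; the toy data and priors are placeholders.

```python
import numpy as np
import pymc3 as pm
import arviz as az

rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=2.0, size=100)     # toy observations

with pm.Model():
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)        # weakly informative priors
    sigma = pm.HalfNormal("sigma", sigma=10.0)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=data)
    idata = pm.sample(1000, tune=1000, return_inferencedata=True, random_seed=0)

print(az.summary(idata, var_names=["mu", "sigma"]))
```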

A spectral approach to gradient estimation for implicit distributions

J Shi, S Sun, J Zhu - International Conference on Machine …, 2018 - proceedings.mlr.press
Recently there has been increasing interest in learning and inference with implicit
distributions (i.e., distributions without tractable densities). To this end, we develop a gradient …
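
The spectral idea can be sketched roughly as follows: take the top eigenpairs of an RBF Gram matrix over samples, form Nyström eigenfunctions, and use Stein's identity to obtain coefficients for the score ∇ log q. The bandwidth, the number of eigenfunctions, and the toy data are placeholders, and the scaling conventions follow one common presentation rather than the paper verbatim.

```python
import numpy as np

def rbf_and_grad(X, Y, sigma=1.0):
    """RBF kernel values k(x, y) and their gradients in x."""
    diff = X[:, None, :] - Y[None, :, :]                    # (n, m, d)
    K = np.exp(-(diff ** 2).sum(-1) / (2 * sigma ** 2))     # (n, m)
    dK = -diff / sigma ** 2 * K[:, :, None]                 # (n, m, d)
    return K, dK

def spectral_score_estimator(samples, J=6, sigma=1.0):
    """Build x -> estimated grad log q(x) from samples of q (Nystrom-style sketch)."""
    M = len(samples)
    K, dK = rbf_and_grad(samples, samples, sigma)
    lam, U = np.linalg.eigh(K)
    lam, U = lam[::-1][:J], U[:, ::-1][:, :J]                # top-J Gram eigenpairs
    # Gradients of Nystrom eigenfunctions at the samples: (M, J, d)
    grad_psi = np.sqrt(M) / lam[None, :, None] * np.einsum("nmd,mj->njd", dK, U)
    beta = -grad_psi.mean(axis=0)                            # (J, d), via Stein's identity

    def score(x):
        Kx, _ = rbf_and_grad(x, samples, sigma)              # (n, M)
        psi = np.sqrt(M) / lam[None, :] * (Kx @ U)           # (n, J)
        return psi @ beta                                    # (n, d)
    return score

rng = np.random.default_rng(0)
samples = rng.normal(size=(500, 1))          # only samples from q = N(0, 1) are available
score = spectral_score_estimator(samples)
x = np.array([[-1.0], [0.0], [1.0]])
print(score(x).ravel())                      # roughly [1, 0, -1], since grad log q(x) = -x
```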

K2-ABC: Approximate Bayesian computation with kernel embeddings

M Park, W Jitkrittum… - Artificial intelligence and …, 2016 - proceedings.mlr.press
Complicated generative models often result in a situation where computing the likelihood of
observed data is intractable, while simulating from the conditional density given a parameter …
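
The underlying recipe (soft ABC weights computed from a kernel-embedding discrepancy between simulated and observed data, rather than from hand-picked summary statistics) can be sketched as follows; the simulator, prior, tolerance ε, and bandwidth are toy placeholders.

```python
import numpy as np

def mmd2(X, Y, sigma=1.0):
    """Biased squared MMD between two one-dimensional samples (RBF kernel)."""
    k = lambda A, B: np.exp(-(A[:, None] - B[None, :]) ** 2 / (2 * sigma ** 2))
    return k(X, X).mean() - 2 * k(X, Y).mean() + k(Y, Y).mean()

rng = np.random.default_rng(0)
y_obs = rng.normal(loc=2.0, scale=1.0, size=200)     # "observed" data (toy)

def simulate(theta, n=200):
    """Simulator whose likelihood is intractable in general; a Gaussian stands in here."""
    return rng.normal(loc=theta, scale=1.0, size=n)

eps = 0.05                                           # soft ABC tolerance
thetas = rng.uniform(-5, 5, size=1000)               # draws from a uniform prior
weights = np.array([np.exp(-mmd2(simulate(t), y_obs) / eps) for t in thetas])
weights /= weights.sum()

print(np.sum(weights * thetas))                      # posterior mean estimate, near 2
```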

Gradient-free Hamiltonian Monte Carlo with efficient kernel exponential families

H Strathmann, D Sejdinovic… - Advances in …, 2015 - proceedings.neurips.cc
We propose Kernel Hamiltonian Monte Carlo (KMC), a gradient-free adaptive
MCMC algorithm based on Hamiltonian Monte Carlo (HMC). On target densities where …

RKHS-SHAP: Shapley values for kernel methods

SL Chau, R Hu, J Gonzalez… - Advances in neural …, 2022 - proceedings.neurips.cc
Feature attribution for kernel methods is often heuristic and not individualised for each
prediction. To address this, we turn to the concept of Shapley values (SV), a coalition game …
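
For contrast with the paper's RKHS-specific estimator, the generic Monte Carlo (permutation-sampling) Shapley value for a kernel ridge model, with absent features mean-imputed against a background point, looks like the sketch below; the model, data, and background choice are placeholders, and this is not the RKHS-SHAP algorithm itself.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + 0.1 * rng.normal(size=300)   # toy regression
model = KernelRidge(kernel="rbf", alpha=0.1).fit(X, y)

background = X.mean(axis=0)          # reference values for "absent" features

def value(x, subset):
    """Model prediction with features outside `subset` replaced by the background."""
    z = background.copy()
    z[list(subset)] = x[list(subset)]
    return model.predict(z.reshape(1, -1))[0]

def shapley(x, n_perm=200):
    """Monte Carlo Shapley values: average marginal contribution over random orders."""
    d = len(x)
    phi = np.zeros(d)
    for _ in range(n_perm):
        order = rng.permutation(d)
        present = []
        prev = value(x, present)
        for j in order:
            present.append(j)
            cur = value(x, present)
            phi[j] += cur - prev
            prev = cur
    return phi / n_perm

print(np.round(shapley(X[0]), 2))    # attributions concentrate on features 0 and 2
```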