Kernel mean embedding of distributions: A review and beyond

K Muandet, K Fukumizu… - Foundations and Trends® in Machine Learning, 2017 - nowpublishers.com
A Hilbert space embedding of a distribution—in short, a kernel mean embedding—has
recently emerged as a powerful tool for machine learning and statistical inference. The basic …
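
The core object here is the embedding μ_P = E_{x∼P}[k(x, ·)] of a distribution P into a reproducing kernel Hilbert space; comparing two embeddings gives the maximum mean discrepancy (MMD). As a minimal sketch of that idea, assuming a Gaussian RBF kernel, a fixed bandwidth, and synthetic Gaussian samples (none of which come from the review itself), the biased empirical MMD estimator can be written as:

    import numpy as np

    def rbf_kernel(X, Y, bandwidth=1.0):
        # Gaussian RBF kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2)).
        sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
        return np.exp(-sq / (2 * bandwidth**2))

    def mmd2(X, Y, bandwidth=1.0):
        # Squared MMD between the empirical kernel mean embeddings of samples X and Y.
        return (rbf_kernel(X, X, bandwidth).mean()
                + rbf_kernel(Y, Y, bandwidth).mean()
                - 2 * rbf_kernel(X, Y, bandwidth).mean())

    rng = np.random.default_rng(0)
    X = rng.normal(0.0, 1.0, size=(200, 1))   # sample from P
    Y = rng.normal(0.5, 1.0, size=(200, 1))   # sample from Q
    print(mmd2(X, Y))                          # larger values indicate P and Q differ

A value near zero suggests the two samples are hard to tell apart in the chosen RKHS; the kernel and its bandwidth determine which differences the embedding can detect.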

Gaussian processes and kernel methods: A review on connections and equivalences

M Kanagawa, P Hennig, D Sejdinovic… - arXiv preprint arXiv …, 2018 - arxiv.org
This paper is an attempt to bridge the conceptual gaps between researchers working on the
two widely used approaches based on positive definite kernels: Bayesian learning or …
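
A representative equivalence in this line of work: the posterior mean of Gaussian process regression with noise variance σ² coincides with the kernel ridge regression estimate whose ridge penalty equals σ². A small numerical check, with an assumed RBF kernel and toy data (hyperparameters are illustrative):

    import numpy as np

    def rbf(X, Y, ls=1.0):
        sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
        return np.exp(-sq / (2 * ls**2))

    def gp_posterior_mean(X, y, Xs, noise_var):
        # Bayesian view: posterior mean of a GP prior with kernel k and Gaussian noise.
        return rbf(Xs, X) @ np.linalg.solve(rbf(X, X) + noise_var * np.eye(len(X)), y)

    def kernel_ridge(X, y, Xs, lam):
        # Frequentist view: minimizer of sum_i (y_i - f(x_i))^2 + lam * ||f||_H^2.
        return rbf(Xs, X) @ np.linalg.solve(rbf(X, X) + lam * np.eye(len(X)), y)

    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, size=(30, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=30)
    Xs = np.linspace(-3, 3, 7)[:, None]

    # Setting the ridge penalty equal to the noise variance makes the predictions agree.
    print(np.allclose(gp_posterior_mean(X, y, Xs, 0.01), kernel_ridge(X, y, Xs, 0.01)))  # True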

Benchmarking simulation-based inference

JM Lueckmann, J Boelts, D Greenberg… - International …, 2021 - proceedings.mlr.press
Recent advances in probabilistic modelling have led to a large number of simulation-based
inference algorithms which do not require numerical evaluation of likelihoods. However, a …
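
For orientation, the simplest likelihood-free baseline in this family is rejection ABC: draw parameters from the prior, run the simulator, and keep the draws whose output lands near the observation. The sketch below uses a made-up one-dimensional simulator, prior, and tolerance rather than any of the benchmark's tasks:

    import numpy as np

    rng = np.random.default_rng(0)
    simulate = lambda theta: rng.normal(theta, 1.0)    # black-box simulator; no likelihood evaluated
    x_obs = 2.0                                        # the observation

    theta = rng.uniform(-5, 5, size=100_000)           # parameter draws from the prior
    x_sim = simulate(theta)
    accepted = theta[np.abs(x_sim - x_obs) < 0.1]      # keep draws whose simulation is close

    print(accepted.mean(), accepted.std())             # crude approximation of the posterior

The neural methods compared in the benchmark replace this hard accept/reject step with learned surrogates of the posterior, the likelihood, or the likelihood ratio.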

Stein variational gradient descent: A general purpose Bayesian inference algorithm

Q Liu, D Wang - Advances in neural information processing …, 2016 - proceedings.neurips.cc
We propose a general purpose variational inference algorithm that forms a natural
counterpart of gradient descent for optimization. Our method iteratively transports a set of …
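
The update the method iterates moves each particle along a kernel-weighted combination of the target's score and a repulsive term that keeps the particles spread out. A minimal NumPy sketch, assuming an RBF kernel with fixed bandwidth and a standard normal target (step size, particle count, and iteration count are illustrative):

    import numpy as np

    def svgd_update(particles, score, bandwidth=1.0, step=0.1):
        # One SVGD step: phi(x_i) = (1/n) sum_j [k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i)].
        n = len(particles)
        diffs = particles[:, None, :] - particles[None, :, :]           # x_i - x_j
        K = np.exp(-np.sum(diffs**2, -1) / (2 * bandwidth**2))          # kernel matrix
        attract = K @ score(particles)                                   # drives particles toward high density
        repulse = np.sum(K[:, :, None] * diffs, axis=1) / bandwidth**2   # pushes particles apart
        return particles + step * (attract + repulse) / n

    rng = np.random.default_rng(0)
    x = rng.normal(3.0, 0.5, size=(100, 2))       # particles start far from the target
    for _ in range(500):
        x = svgd_update(x, lambda z: -z)           # score of a standard 2-D normal is -x
    print(x.mean(axis=0), x.std(axis=0))           # roughly zero mean and unit spread

With the repulsive term removed, all particles would collapse onto the mode; with the score term removed, they would simply spread out under the kernel.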

A kernelized Stein discrepancy for goodness-of-fit tests

Q Liu, J Lee, M Jordan - International conference on …, 2016 - proceedings.mlr.press
We derive a new discrepancy statistic for measuring differences between two probability
distributions based on combining Stein's identity and the reproducing kernel Hilbert space …
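
The discrepancy applies Stein's identity to an RKHS of test functions: for a density p, the statistic depends on p only through its score ∇ log p, so no normalizing constant is needed. Below is a sketch of the plug-in (V-statistic) estimator for an assumed RBF kernel; the paper's goodness-of-fit test itself rests on a U-statistic with a calibrated null, which this omits:

    import numpy as np

    def ksd_v_statistic(X, score, bandwidth=1.0):
        # Squared kernelized Stein discrepancy (V-statistic) between sample X and density p,
        # where p enters only through its score function s(x) = grad log p(x).
        n, d = X.shape
        S = score(X)                                         # (n, d) score at each sample point
        diffs = X[:, None, :] - X[None, :, :]                # x_i - x_j
        sq = np.sum(diffs**2, -1)
        h2 = bandwidth**2
        K = np.exp(-sq / (2 * h2))                           # RBF base kernel
        term1 = (S @ S.T) * K                                # s(x_i)' s(x_j) k
        term2 = np.einsum('id,ijd->ij', S, diffs) / h2 * K   # s(x_i)' grad_{x_j} k
        term3 = -np.einsum('jd,ijd->ij', S, diffs) / h2 * K  # s(x_j)' grad_{x_i} k
        term4 = (d / h2 - sq / h2**2) * K                    # trace of grad_{x_i} grad_{x_j} k
        return (term1 + term2 + term3 + term4).mean()

    rng = np.random.default_rng(0)
    score = lambda x: -x                                     # score of a standard normal target
    good = rng.normal(0.0, 1.0, size=(300, 1))
    bad = rng.normal(0.8, 1.0, size=(300, 1))
    print(ksd_v_statistic(good, score), ksd_v_statistic(bad, score))   # second value is larger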

A survey of Monte Carlo methods for parameter estimation

D Luengo, L Martino, M Bugallo, V Elvira… - EURASIP Journal on …, 2020 - Springer
Statistical signal processing applications usually require the estimation of some parameters
of interest given a set of observed data. These estimates are typically obtained either by …
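
Among the estimators such surveys cover, self-normalized importance sampling is perhaps the shortest to state: reweight proposal draws by the ratio of the unnormalized posterior to the proposal density. A toy sketch for a Gaussian location parameter, with a conjugate setup chosen so the answer can be checked analytically (all distributions and sample sizes are illustrative, not from the survey):

    import numpy as np

    rng = np.random.default_rng(0)
    y = rng.normal(1.5, 1.0, size=20)                  # observed data, y_i ~ N(theta, 1)

    theta = rng.normal(0.0, 3.0, size=50_000)          # draws from a broad proposal q = N(0, 3^2)
    log_prior = -0.5 * theta**2                         # prior theta ~ N(0, 1), up to a constant
    log_lik = -0.5 * ((y[:, None] - theta[None, :])**2).sum(axis=0)
    log_q = -0.5 * (theta / 3.0)**2
    log_w = log_prior + log_lik - log_q                 # unnormalized importance weights
    w = np.exp(log_w - log_w.max())                     # subtract the max for numerical stability
    w /= w.sum()

    print(np.sum(w * theta))                            # close to the exact posterior mean, n*mean(y)/(n+1)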

A universal approximation theorem of deep neural networks for expressing probability distributions

Y Lu, J Lu - Advances in neural information processing …, 2020 - proceedings.neurips.cc
This paper studies the universal approximation property of deep neural networks for
representing probability distributions. Given a target distribution $\pi$ and a source …

Stein variational gradient descent as gradient flow

Q Liu - Advances in neural information processing systems, 2017 - proceedings.neurips.cc
Stein variational gradient descent (SVGD) is a deterministic sampling algorithm that
iteratively transports a set of particles to approximate given distributions, based on a …
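
In the infinite-particle limit the particle distribution μ_t follows a continuity equation driven by the same kernelized direction as the finite-particle algorithm, and along this flow the KL divergence to the target p decreases at a rate governed by the kernelized Stein discrepancy. In notation adapted here (not quoted from the paper):

    \partial_t \mu_t + \nabla \cdot \big( \mu_t \, \phi^{*}_{\mu_t} \big) = 0,
    \qquad
    \phi^{*}_{\mu}(x) = \mathbb{E}_{y \sim \mu}\!\left[ k(x, y)\, \nabla_y \log p(y) + \nabla_y k(x, y) \right],

so that the time derivative of KL(\mu_t \| p) equals minus the squared kernelized Stein discrepancy between \mu_t and p.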

Detecting out-of-distribution inputs to deep generative models using typicality

E Nalisnick, A Matsukawa, YW Teh… - arXiv preprint arXiv …, 2019 - arxiv.org
Recent work has shown that deep generative models can assign higher likelihood to out-of-
distribution data sets than to their training data (Nalisnick et al., 2019; Choi et al., 2019). We …
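
The test proposed there is two-sided: a batch is flagged when its average negative log-likelihood is far from the model's typical value (its entropy estimate), which also catches inputs assigned unusually high likelihood. A toy sketch with a fitted Gaussian standing in for the deep generative model (batch size, bootstrap calibration, and the 99% threshold are illustrative choices, not the paper's):

    import numpy as np

    rng = np.random.default_rng(0)
    train = rng.normal(0.0, 1.0, size=10_000)                 # "training data" for the toy model
    mu, sigma = train.mean(), train.std()
    nll = lambda x: 0.5 * ((x - mu) / sigma)**2 + np.log(sigma) + 0.5 * np.log(2 * np.pi)

    entropy_hat = nll(train).mean()                            # typical per-point NLL under the model
    M = 64                                                     # assumed test batch size
    boot = np.array([abs(nll(rng.choice(train, M)).mean() - entropy_hat) for _ in range(1000)])
    threshold = np.quantile(boot, 0.99)                        # calibrated on in-distribution batches

    def is_ood(batch):
        # Flag the batch if its mean NLL is atypically low OR atypically high.
        return abs(nll(batch).mean() - entropy_hat) > threshold

    print(is_ood(rng.normal(0.0, 1.0, size=M)))    # usually False: an in-distribution batch
    print(is_ood(rng.normal(0.0, 0.3, size=M)))    # True: higher likelihood than typical, still flagged

The second batch would slip past a plain likelihood threshold, which is exactly the failure mode the paper documents.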

Measuring sample quality with kernels

J Gorham, L Mackey - International Conference on Machine …, 2017 - proceedings.mlr.press
Approximate Markov chain Monte Carlo (MCMC) offers the promise of more rapid
sampling at the cost of more biased inference. Since standard MCMC diagnostics fail to …
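
The quality measure used there is again a kernel Stein discrepancy: given the target's score \nabla \log p and a base kernel k, each pair of sample points is scored by a Stein kernel (notation adapted, not quoted):

    k_0(x, y) = \nabla \log p(x)^{\top} \nabla \log p(y)\, k(x, y)
              + \nabla \log p(x)^{\top} \nabla_y k(x, y)
              + \nabla \log p(y)^{\top} \nabla_x k(x, y)
              + \nabla_x \cdot \nabla_y k(x, y),

and the discrepancy of a sample x_1, ..., x_n is \big( n^{-2} \sum_{i,j} k_0(x_i, x_j) \big)^{1/2}, which can be evaluated without the normalizing constant of p and so serves as a diagnostic for biased, approximate MCMC.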