A review of uncertainty quantification in deep learning: Techniques, applications and challenges
Uncertainty quantification (UQ) methods play a pivotal role in reducing the impact of
uncertainties during both optimization and decision-making processes. They have been …
Efficiently sampling functions from Gaussian process posteriors
Gaussian processes are the gold standard for many real-world modeling problems,
especially in cases where a model's success hinges upon its ability to faithfully represent …
Sparse Gaussian processes with spherical harmonic features
V Dutordoir, N Durrande… - … Conference on Machine …, 2020 - proceedings.mlr.press
We introduce a new class of inter-domain variational Gaussian processes (GP) where data
is mapped onto the unit hypersphere in order to use spherical harmonic representations …
Pathwise conditioning of Gaussian processes
As Gaussian processes are used to answer increasingly complex questions, analytic
solutions become scarcer and scarcer. Monte Carlo methods act as a convenient bridge for …
Global inducing point variational posteriors for Bayesian neural networks and deep Gaussian processes
SW Ober, L Aitchison - International Conference on Machine …, 2021 - proceedings.mlr.press
We consider the optimal approximate posterior over the top-layer weights in a Bayesian
neural network for regression, and show that it exhibits strong dependencies on the lower …
Conditioning sparse variational Gaussian processes for online decision-making
With a principled representation of uncertainty and closed form posterior updates, Gaussian
processes (GPs) are a natural choice for online decision making. However, Gaussian …
Sparse Gaussian processes revisited: Bayesian approaches to inducing-variable approximations
Variational inference techniques based on inducing variables provide an elegant framework
for scalable posterior estimation in Gaussian process (GP) models. Besides enabling …
Task-agnostic amortized inference of Gaussian process hyperparameters
Gaussian processes (GPs) are flexible priors for modeling functions. However, their success
depends on the kernel accurately reflecting the properties of the data. One of the appeals of …
Tighter bounds on the log marginal likelihood of Gaussian process regression using conjugate gradients
We propose a lower bound on the log marginal likelihood of Gaussian process regression
models that can be computed without matrix factorisation of the full kernel matrix. We show …
Variational Bayesian approximation of inverse problems using sparse precision matrices
Inverse problems involving partial differential equations (PDEs) are widely used in science
and engineering. Although such problems are generally ill-posed, different regularisation …