A review of uncertainty quantification in deep learning: Techniques, applications and challenges

M Abdar, F Pourpanah, S Hussain, D Rezazadegan… - Information fusion, 2021 - Elsevier
Uncertainty quantification (UQ) methods play a pivotal role in reducing the impact of
uncertainties during both optimization and decision making processes. They have been …

Efficiently sampling functions from Gaussian process posteriors

J Wilson, V Borovitskiy, A Terenin… - International …, 2020 - proceedings.mlr.press
Gaussian processes are the gold standard for many real-world modeling problems,
especially in cases where a model's success hinges upon its ability to faithfully represent …
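For readers who want a concrete picture, here is a minimal NumPy sketch of the decoupled-sampling idea this abstract refers to: draw an approximate prior function sample with random Fourier features, then correct it with a pathwise (Matheron-style) update. The function names, RBF kernel choice, and parameters are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def rff_features(X, n_features, lengthscale, rng):
    """Random Fourier features approximating a unit-variance RBF kernel."""
    d = X.shape[1]
    omega = rng.standard_normal((d, n_features)) / lengthscale
    tau = rng.uniform(0.0, 2.0 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ omega + tau), (omega, tau)

def rbf_kernel(A, B, lengthscale):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def sample_posterior_path(X, y, X_star, lengthscale=1.0, noise=0.1,
                          n_features=1000, seed=0):
    """One approximate posterior function draw: RFF prior sample
    plus a pathwise (Matheron-style) update at the data."""
    rng = np.random.default_rng(seed)
    Phi, (omega, tau) = rff_features(X, n_features, lengthscale, rng)
    w = rng.standard_normal(n_features)            # prior weight sample
    f_prior_X = Phi @ w                            # prior sample at the data
    Phi_star = np.sqrt(2.0 / n_features) * np.cos(X_star @ omega + tau)
    f_prior_star = Phi_star @ w                    # prior sample at test inputs
    K = rbf_kernel(X, X, lengthscale) + noise ** 2 * np.eye(len(X))
    eps = noise * rng.standard_normal(len(X))      # simulated observation noise
    update = rbf_kernel(X_star, X, lengthscale) @ np.linalg.solve(K, y - f_prior_X - eps)
    return f_prior_star + update
```

The appeal of this decoupling is that the prior part costs only a feature expansion per sample, while the data-dependent correction is computed once and reused across evaluation points.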

Sparse Gaussian processes with spherical harmonic features

V Dutordoir, N Durrande… - … Conference on Machine …, 2020 - proceedings.mlr.press
We introduce a new class of inter-domain variational Gaussian processes (GP) where data
is mapped onto the unit hypersphere in order to use spherical harmonic representations …
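As a minimal sketch of the kind of hypersphere mapping the abstract describes, one common construction appends a constant bias coordinate and then normalises each point onto the unit sphere; the paper's exact mapping and feature construction may differ.

```python
import numpy as np

def map_to_hypersphere(X, bias=1.0):
    """Append a constant bias coordinate and project each input onto the
    unit hypersphere, so spherical-harmonic feature expansions apply."""
    Xb = np.concatenate([X, np.full((len(X), 1), bias)], axis=1)
    return Xb / np.linalg.norm(Xb, axis=1, keepdims=True)
```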

Pathwise conditioning of Gaussian processes

JT Wilson, V Borovitskiy, A Terenin… - Journal of Machine …, 2021 - jmlr.org
As Gaussian processes are used to answer increasingly complex questions, analytic
solutions become scarcer and scarcer. Monte Carlo methods act as a convenient bridge for …
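For reference, the Gaussian-noise form of the pathwise update (Matheron's rule) that this line of work builds on can be written as follows, where f is a draw from the GP prior and ε an independent draw of the observation noise; notation here is the standard one, not copied from the paper.

```latex
(f \mid \mathbf{y})(\cdot) \;\overset{d}{=}\; f(\cdot)
  + K(\cdot, X)\,\bigl(K(X,X) + \sigma^2 I\bigr)^{-1}
    \bigl(\mathbf{y} - f(X) - \boldsymbol{\varepsilon}\bigr),
\qquad f \sim \mathcal{GP}(0, K),\quad
\boldsymbol{\varepsilon} \sim \mathcal{N}(0, \sigma^2 I).
```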

Global inducing point variational posteriors for Bayesian neural networks and deep Gaussian processes

SW Ober, L Aitchison - International Conference on Machine …, 2021 - proceedings.mlr.press
We consider the optimal approximate posterior over the top-layer weights in a Bayesian
neural network for regression, and show that it exhibits strong dependencies on the lower …

Conditioning sparse variational Gaussian processes for online decision-making

WJ Maddox, S Stanton… - Advances in Neural …, 2021 - proceedings.neurips.cc
With a principled representation of uncertainty and closed form posterior updates, Gaussian
processes (GPs) are a natural choice for online decision making. However, Gaussian …
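The closed-form conditioning that motivates this work can be illustrated with an exact GP: when one observation arrives, the Cholesky factor of K(X,X)+σ²I can be grown in O(n²) instead of refactorised in O(n³). The hypothetical helper below sketches that update only; it is not the paper's sparse variational method.

```python
import numpy as np
from scipy.linalg import solve_triangular

def extend_cholesky(L, k_new, kappa_new):
    """Grow the lower Cholesky factor L of K(X,X)+sigma^2*I when a single
    observation is appended. k_new = K(X, x_new); kappa_new = k(x_new, x_new)+sigma^2."""
    l = solve_triangular(L, k_new, lower=True)     # forward solve: L l = k_new
    d = np.sqrt(kappa_new - l @ l)                 # new diagonal entry
    n = L.shape[0]
    L_new = np.zeros((n + 1, n + 1))
    L_new[:n, :n] = L
    L_new[n, :n] = l
    L_new[n, n] = d
    return L_new
```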

Sparse Gaussian processes revisited: Bayesian approaches to inducing-variable approximations

S Rossi, M Heinonen, E Bonilla… - International …, 2021 - proceedings.mlr.press
Variational inference techniques based on inducing variables provide an elegant framework
for scalable posterior estimation in Gaussian process (GP) models. Besides enabling …
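For context, the standard collapsed inducing-variable bound for GP regression that this family of methods starts from is, with noise variance σ² and the Nyström approximation Q_ff:

```latex
\log p(\mathbf{y}) \;\ge\;
  \log \mathcal{N}\!\bigl(\mathbf{y} \mid \mathbf{0},\; Q_{ff} + \sigma^2 I\bigr)
  \;-\; \frac{1}{2\sigma^2}\,\operatorname{tr}\!\bigl(K_{ff} - Q_{ff}\bigr),
\qquad Q_{ff} = K_{fu}\,K_{uu}^{-1}\,K_{uf}.
```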

Task-agnostic amortized inference of Gaussian process hyperparameters

S Liu, X Sun, PJ Ramadge… - Advances in Neural …, 2020 - proceedings.neurips.cc
Gaussian processes (GPs) are flexible priors for modeling functions. However, their success
depends on the kernel accurately reflecting the properties of the data. One of the appeals of …
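A hypothetical schematic of what amortizing hyperparameter inference can look like: a permutation-invariant encoder summarises the dataset and maps the summary to kernel hyperparameters. The architecture, names, and weights below are illustrative assumptions only, not the paper's model.

```python
import numpy as np

def amortized_hypers(X, y, W1, W2):
    """Permutation-invariant encoder: embed each (x, y) pair, mean-pool over
    the dataset, and map the summary to positive kernel hyperparameters
    (e.g. lengthscale, signal variance). W1, W2 stand in for weights that
    would be trained across many datasets."""
    pairs = np.concatenate([X, y[:, None]], axis=1)   # (n, d+1) point-wise inputs
    h = np.tanh(pairs @ W1)                           # per-point embedding
    summary = h.mean(axis=0)                          # order-invariant dataset summary
    log_hypers = summary @ W2                         # unconstrained outputs
    return np.exp(log_hypers)                         # enforce positivity
```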

Tighter bounds on the log marginal likelihood of Gaussian process regression using conjugate gradients

A Artemev, DR Burt… - … Conference on Machine …, 2021 - proceedings.mlr.press
We propose a lower bound on the log marginal likelihood of Gaussian process regression
models that can be computed without matrix factorisation of the full kernel matrix. We show …
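One standard route to such a bound, given here as a sketch rather than the paper's exact construction: with A = K_ff + σ²I and residual r = y − Av for any vector v,

```latex
\mathbf{y}^\top A^{-1} \mathbf{y}
  \;=\; 2\,\mathbf{v}^\top \mathbf{y} - \mathbf{v}^\top A\,\mathbf{v} + \mathbf{r}^\top A^{-1}\mathbf{r}
  \;\le\; 2\,\mathbf{v}^\top \mathbf{y} - \mathbf{v}^\top A\,\mathbf{v} + \frac{\lVert \mathbf{r}\rVert^2}{\sigma^2},
```

since A ⪰ σ²I. Taking v to be a conjugate-gradient iterate for Av = y drives the residual down, so the induced lower bound on −½ yᵀA⁻¹y tightens as CG progresses; the log-determinant term of the marginal likelihood needs a separate bound.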

Variational bayesian approximation of inverse problems using sparse precision matrices

J Povala, I Kazlauskaite, E Febrianto, F Cirak… - Computer Methods in …, 2022 - Elsevier
Inverse problems involving partial differential equations (PDEs) are widely used in science
and engineering. Although such problems are generally ill-posed, different regularisation …
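One computational ingredient that such approaches rely on can be sketched as follows: if the variational posterior's precision matrix is parameterised through a sparse triangular factor, drawing samples needs only a sparse triangular solve. This is an illustrative fragment assuming that parameterisation, not the paper's full scheme.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve_triangular

def sample_sparse_precision_gaussian(mean, L, rng):
    """Sample from N(mean, Q^{-1}) with precision Q = L @ L.T, where L is a
    sparse lower-triangular factor: u = mean + L^{-T} z, z ~ N(0, I)."""
    z = rng.standard_normal(len(mean))
    return mean + spsolve_triangular(sparse.csr_matrix(L.T), z, lower=False)
```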