Stochastic variational deep kernel learning

AG Wilson, Z Hu… - Advances in neural …, 2016 - proceedings.neurips.cc
Deep kernel learning combines the non-parametric flexibility of kernel methods with the
inductive biases of deep learning architectures. We propose a novel deep kernel learning …
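
The core idea of a deep kernel, evaluating a base kernel on the outputs of a neural-network feature map so the kernel inherits the network's inductive biases, can be sketched in a few lines of numpy. This is an illustration only (untrained random weights, plain RBF base kernel), not the paper's SV-DKL method; all names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical tiny two-layer feature map g(x): tanh hidden layer, linear output.
# In deep kernel learning the base kernel is applied to g(x) instead of x.
W1 = rng.standard_normal((1, 16))
W2 = rng.standard_normal((16, 4))

def features(x):
    return np.tanh(x @ W1) @ W2

def deep_kernel(a, b, ls=1.0):
    # RBF base kernel evaluated on learned (here: random) features.
    fa, fb = features(a), features(b)
    sq = ((fa[:, None, :] - fb[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / ls**2)

x = np.linspace(-2, 2, 10)[:, None]
K = deep_kernel(x, x)  # valid covariance matrix: symmetric, unit diagonal, PSD
```

Because the composition of any feature map with a positive-definite kernel is still positive definite, `K` remains a valid GP covariance however expressive the network is.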

Variational Fourier features for Gaussian processes

J Hensman, N Durrande, A Solin - Journal of Machine Learning Research, 2018 - jmlr.org
This work brings together two powerful concepts in Gaussian processes: the variational
approach to sparse approximation and the spectral representation of Gaussian processes …
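
The spectral representation invoked here can be illustrated with random Fourier features, a related Monte Carlo construction (not the paper's variational features): by Bochner's theorem, frequencies drawn from a stationary kernel's spectral density yield a cosine feature map whose inner products approximate the kernel. A minimal numpy sketch for the RBF kernel, whose spectral density is a standard Gaussian:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D inputs; exact RBF kernel with unit lengthscale.
x = np.linspace(-3, 3, 50)[:, None]
K_exact = np.exp(-0.5 * (x - x.T) ** 2)

# Monte Carlo spectral approximation: w ~ N(0, 1), b ~ U(0, 2*pi),
# phi(x) = sqrt(2/M) * cos(w x + b), so phi(x) . phi(x') ~= k(x, x').
M = 2000
w = rng.standard_normal((1, M))
b = rng.uniform(0, 2 * np.pi, size=M)
phi = np.sqrt(2.0 / M) * np.cos(x @ w + b)
K_approx = phi @ phi.T

max_err = np.abs(K_exact - K_approx).max()  # shrinks like O(1/sqrt(M))
```

The variational Fourier features of Hensman et al. replace these random frequencies with a fixed, optimally weighted set inside the variational sparse-GP framework.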

Functional regularisation for continual learning with Gaussian processes

MK Titsias, J Schwarz, AGG Matthews… - arXiv preprint arXiv …, 2019 - arxiv.org
We introduce a framework for Continual Learning (CL) based on Bayesian inference over
the function space rather than the parameters of a deep neural network. This method …

A unifying framework for Gaussian process pseudo-point approximations using power expectation propagation

TD Bui, J Yan, RE Turner - Journal of Machine Learning Research, 2017 - jmlr.org
Gaussian processes (GPs) are flexible distributions over functions that enable high-level
assumptions about unknown functions to be encoded in a parsimonious, flexible and …

Heterogeneous multi-output Gaussian process prediction

P Moreno-Muñoz, A Artés… - Advances in neural …, 2018 - proceedings.neurips.cc
We present a novel extension of multi-output Gaussian processes for handling
heterogeneous outputs. We assume that each output has its own likelihood function and use …

Streaming sparse Gaussian process approximations

TD Bui, C Nguyen, RE Turner - Advances in Neural …, 2017 - proceedings.neurips.cc
Sparse pseudo-point approximations for Gaussian process (GP) models provide a suite of
methods that support deployment of GPs in the large data regime and enable analytic …
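
A minimal example of the pseudo-point idea these streaming methods build on is the DTC/subset-of-regressors predictive mean: the N x N kernel matrix is replaced by a low-rank approximation through M inducing inputs, cutting the cost from O(N^3) to O(N M^2). A numpy sketch under toy assumptions (noisy sine data, RBF kernel), not the paper's streaming algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel between two sets of 1-D inputs.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

# Hypothetical data: 200 noisy observations of sin(x).
X = rng.uniform(-3, 3, 200)
y = np.sin(X) + 0.1 * rng.standard_normal(200)
noise = 0.1 ** 2

# M = 10 pseudo-points (inducing inputs) summarize the full dataset.
Z = np.linspace(-3, 3, 10)
Kzz = rbf(Z, Z) + 1e-8 * np.eye(len(Z))   # jitter for numerical stability
Kzx = rbf(Z, X)

# DTC predictive mean at test inputs Xs:
#   mu = K*z (noise * Kzz + Kzx Kxz)^{-1} Kzx y
Xs = np.linspace(-3, 3, 5)
mu = rbf(Xs, Z) @ np.linalg.solve(noise * Kzz + Kzx @ Kzx.T, Kzx @ y)
```

The streaming setting adds the harder question of how to update `Z` and the posterior over pseudo-point values as new batches arrive without revisiting old data.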

A stochastic variational framework for recurrent Gaussian processes models

CLC Mattos, GA Barreto - Neural Networks, 2019 - Elsevier
Gaussian process (GP) models have been successfully applied to the problem
of learning from sequential observations. In such a context, the family of Recurrent Gaussian …

Transforming Gaussian processes with normalizing flows

J Maroñas, O Hamelijnck… - International …, 2021 - proceedings.mlr.press
Gaussian processes (GPs) can be used as flexible, non-parametric function priors. Inspired
by the growing body of work on Normalizing Flows, we enlarge this class of priors through a …

Chained Gaussian processes

AD Saul, J Hensman, A Vehtari… - Artificial intelligence …, 2016 - proceedings.mlr.press
Gaussian process models are flexible, Bayesian non-parametric approaches to regression.
Properties of multivariate Gaussians mean that they can be combined linearly in the manner …
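
The linearity property the snippet appeals to, that any linear map of a multivariate Gaussian is again Gaussian with mean A mu and covariance A Sigma A^T, can be checked empirically. A small numpy sketch with a hypothetical 2-D example:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy multivariate Gaussian and an arbitrary linear combination matrix A.
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
A = np.array([[1.0, 1.0], [2.0, -1.0]])

samples = rng.multivariate_normal(mu, Sigma, size=200_000)
combined = samples @ A.T                  # apply the linear map to every sample

emp_mean = combined.mean(axis=0)
emp_cov = np.cov(combined.T)
th_mean = A @ mu                          # = [-1., 4.]
th_cov = A @ Sigma @ A.T                  # closed form, no sampling needed
```

It is exactly this closure under linear maps that lets GP models combine latent functions analytically; chained GPs extend past it by linking multiple GPs through a non-Gaussian likelihood instead.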

Doubly stochastic variational inference for neural processes with hierarchical latent variables

Q Wang, H Van Hoof - International Conference on Machine …, 2020 - proceedings.mlr.press
Neural processes (NPs) constitute a family of variational approximate models for stochastic
processes with promising properties in computational efficiency and uncertainty …