Stochastic variational deep kernel learning
Deep kernel learning combines the non-parametric flexibility of kernel methods with the
inductive biases of deep learning architectures. We propose a novel deep kernel learning …
Variational Fourier features for Gaussian processes
This work brings together two powerful concepts in Gaussian processes: the variational
approach to sparse approximation and the spectral representation of Gaussian processes …
Functional regularisation for continual learning with Gaussian processes
We introduce a framework for Continual Learning (CL) based on Bayesian inference over
the function space rather than the parameters of a deep neural network. This method …
A unifying framework for Gaussian process pseudo-point approximations using power expectation propagation
Gaussian processes (GPs) are flexible distributions over functions that enable high-level
assumptions about unknown functions to be encoded in a parsimonious, flexible and …
Heterogeneous multi-output Gaussian process prediction
P Moreno-Muñoz, A Artés… - Advances in neural …, 2018 - proceedings.neurips.cc
We present a novel extension of multi-output Gaussian processes for handling
heterogeneous outputs. We assume that each output has its own likelihood function and use …
Streaming sparse Gaussian process approximations
Sparse pseudo-point approximations for Gaussian process (GP) models provide a suite of
methods that support deployment of GPs in the large data regime and enable analytic …
A stochastic variational framework for recurrent Gaussian processes models
CLC Mattos, GA Barreto - Neural Networks, 2019 - Elsevier
Gaussian Process (GP) models have been successfully applied to the problem
of learning from sequential observations. In such context, the family of Recurrent Gaussian …
Transforming Gaussian processes with normalizing flows
J Maroñas, O Hamelijnck… - International …, 2021 - proceedings.mlr.press
Gaussian Processes (GP) can be used as flexible, non-parametric function priors. Inspired
by the growing body of work on Normalizing Flows, we enlarge this class of priors through a …
Chained Gaussian processes
Gaussian process models are flexible, Bayesian non-parametric approaches to regression.
Properties of multivariate Gaussians mean that they can be combined linearly in the manner …
Doubly stochastic variational inference for neural processes with hierarchical latent variables
Q Wang, H Van Hoof - International Conference on Machine …, 2020 - proceedings.mlr.press
Neural processes (NPs) constitute a family of variational approximate models for stochastic
processes with promising properties in computational efficiency and uncertainty …