When Gaussian process meets big data: A review of scalable GPs

H Liu, YS Ong, X Shen, J Cai - IEEE transactions on neural …, 2020 - ieeexplore.ieee.org
The vast quantity of information brought by big data as well as the evolving computer
hardware encourages success stories in the machine learning community. In the …

Priors in Bayesian deep learning: A review

V Fortuin - International Statistical Review, 2022 - Wiley Online Library
While the choice of prior is one of the most critical parts of the Bayesian inference workflow,
recent Bayesian deep learning models have often fallen back on vague priors, such as …

An optimization-centric view on Bayes' rule: Reviewing and generalizing variational inference

J Knoblauch, J Jewson, T Damoulas - Journal of Machine Learning …, 2022 - jmlr.org
We advocate an optimization-centric view of Bayesian inference. Our inspiration is the
representation of Bayes' rule as infinite-dimensional optimization (Csiszár, 1975; Donsker …

Doubly stochastic variational inference for deep Gaussian processes

H Salimbeni, M Deisenroth - Advances in neural information …, 2017 - proceedings.neurips.cc
Deep Gaussian processes (DGPs) are multi-layer generalizations of GPs, but
inference in these models has proved challenging. Existing approaches to inference in DGP …

Functional variational Bayesian neural networks

S Sun, G Zhang, J Shi, R Grosse - arXiv preprint arXiv:1903.05779, 2019 - arxiv.org
Variational Bayesian neural networks (BNNs) perform variational inference over weights, but
it is difficult to specify meaningful priors and approximate posteriors in a high-dimensional …

Conventional and contemporary approaches used in text to speech synthesis: A review

N Kaur, P Singh - Artificial Intelligence Review, 2023 - Springer
Nowadays speech synthesis or text to speech (TTS), the ability of a system to produce
human-like, natural-sounding voice from written text, is gaining popularity in the field of speech …

Kernel methods through the roof: handling billions of points efficiently

G Meanti, L Carratino, L Rosasco… - Advances in Neural …, 2020 - proceedings.neurips.cc
Kernel methods provide an elegant and principled approach to nonparametric learning, but
so far could hardly be used in large scale problems, since naïve implementations scale …

Adversarial examples, uncertainty, and transfer testing robustness in Gaussian process hybrid deep networks

J Bradshaw, AGG Matthews, Z Ghahramani - arXiv preprint arXiv …, 2017 - arxiv.org
Deep neural networks (DNNs) have excellent representative power and are state-of-the-art
classifiers on many tasks. However, they often do not capture their own uncertainties well …

Bayesian layers: A module for neural network uncertainty

D Tran, M Dusenberry… - Advances in neural …, 2019 - proceedings.neurips.cc
We describe Bayesian Layers, a module designed for fast experimentation with
neural network uncertainty. It extends neural network libraries with drop-in replacements for …

Deep Gaussian processes for multi-fidelity modeling

K Cutajar, M Pullin, A Damianou, N Lawrence… - arXiv preprint arXiv …, 2019 - arxiv.org
Multi-fidelity methods are prominently used when cheaply-obtained, but possibly biased and
noisy, observations must be effectively combined with limited or expensive true data in order …