When Gaussian process meets big data: A review of scalable GPs
The vast quantity of information brought by big data as well as the evolving computer
hardware encourages success stories in the machine learning community. In the …
Priors in Bayesian deep learning: A review
V Fortuin - International Statistical Review, 2022 - Wiley Online Library
While the choice of prior is one of the most critical parts of the Bayesian inference workflow,
recent Bayesian deep learning models have often fallen back on vague priors, such as …
An optimization-centric view on Bayes' rule: Reviewing and generalizing variational inference
We advocate an optimization-centric view of Bayesian inference. Our inspiration is the
representation of Bayes' rule as infinite-dimensional optimization (Csiszár, 1975; Donsker …
Doubly stochastic variational inference for deep Gaussian processes
H Salimbeni, M Deisenroth - Advances in neural information …, 2017 - proceedings.neurips.cc
Abstract Deep Gaussian processes (DGPs) are multi-layer generalizations of GPs, but
inference in these models has proved challenging. Existing approaches to inference in DGP …
Functional variational Bayesian neural networks
Variational Bayesian neural networks (BNNs) perform variational inference over weights, but
it is difficult to specify meaningful priors and approximate posteriors in a high-dimensional …
Conventional and contemporary approaches used in text to speech synthesis: A review
Nowadays speech synthesis, or text-to-speech (TTS), the ability of a system to produce human-
like, natural-sounding speech from written text, is gaining popularity in the field of speech …
Kernel methods through the roof: handling billions of points efficiently
Kernel methods provide an elegant and principled approach to nonparametric learning, but
so far could hardly be used in large scale problems, since naïve implementations scale …
Adversarial examples, uncertainty, and transfer testing robustness in Gaussian process hybrid deep networks
J Bradshaw, AGG Matthews, Z Ghahramani - arXiv preprint arXiv …, 2017 - arxiv.org
Deep neural networks (DNNs) have excellent representative power and are state of the art
classifiers on many tasks. However, they often do not capture their own uncertainties well …
Bayesian layers: A module for neural network uncertainty
D Tran, M Dusenberry… - Advances in neural …, 2019 - proceedings.neurips.cc
Abstract We describe Bayesian Layers, a module designed for fast experimentation with
neural network uncertainty. It extends neural network libraries with drop-in replacements for …
Deep Gaussian processes for multi-fidelity modeling
Multi-fidelity methods are prominently used when cheaply-obtained, but possibly biased and
noisy, observations must be effectively combined with limited or expensive true data in order …