Advances in variational inference

C Zhang, J Bütepage, H Kjellström… - IEEE transactions on …, 2018 - ieeexplore.ieee.org
Many modern unsupervised or semi-supervised machine learning algorithms rely on
Bayesian probabilistic models. These models are usually intractable and thus require …
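
As a hedged illustration of the kind of approximation surveyed here (not code from the paper), the sketch below forms a Monte Carlo estimate of the evidence lower bound (ELBO) for a toy conjugate Gaussian model with a Gaussian variational family; the model, observation, and sample count are assumptions chosen so the result can be checked against the exact posterior.

    import numpy as np

    # Toy conjugate model (an assumption for illustration): p(z) = N(0, 1),
    # p(x | z) = N(z, 1), one observation x_obs; variational family q(z) = N(mu, sigma^2).
    rng = np.random.default_rng(0)
    x_obs = 1.5

    def log_joint(z):
        log_prior = -0.5 * z ** 2 - 0.5 * np.log(2 * np.pi)
        log_lik = -0.5 * (x_obs - z) ** 2 - 0.5 * np.log(2 * np.pi)
        return log_prior + log_lik

    def elbo(mu, sigma, n_samples=100_000):
        # Monte Carlo estimate of E_q[log p(x_obs, z) - log q(z)].
        z = mu + sigma * rng.standard_normal(n_samples)
        log_q = -0.5 * ((z - mu) / sigma) ** 2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)
        return np.mean(log_joint(z) - log_q)

    # The exact posterior here is N(0.75, 0.5), so the ELBO peaks near
    # mu = 0.75, sigma = sqrt(0.5), where it equals log p(x_obs) = log N(1.5; 0, 2).
    print(elbo(0.75, np.sqrt(0.5)), elbo(0.0, 1.0))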

Virtual adversarial training: a regularization method for supervised and semi-supervised learning

T Miyato, S Maeda, M Koyama… - IEEE transactions on …, 2018 - ieeexplore.ieee.org
We propose a new regularization method based on virtual adversarial loss: a new measure
of local smoothness of the conditional label distribution given the input. Virtual adversarial loss …
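
A minimal sketch of the underlying idea, not the authors' implementation: for a toy logistic-regression classifier, approximate the virtual adversarial direction with one finite-difference power-iteration step and report the KL divergence it induces. The model, the numerical gradient, and the hyperparameter values (xi is taken larger than in the paper to keep the finite differences stable) are assumptions made to keep the example self-contained.

    import numpy as np

    rng = np.random.default_rng(0)
    w = rng.standard_normal(5)               # toy classifier weights (assumption)

    def predict(x):
        # Class probabilities from a logistic-regression "network".
        p1 = 1.0 / (1.0 + np.exp(-x @ w))
        return np.array([1.0 - p1, p1])

    def kl(p, q):
        return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))

    def vat_regularizer(x, xi=1e-2, eps=1.0, h=1e-5):
        p = predict(x)
        d = rng.standard_normal(x.shape)
        d /= np.linalg.norm(d)
        # One power-iteration step: the numerical gradient of
        # KL(p(.|x) || p(.|x + xi*d + r)) at r = 0 points along a highly
        # sensitive input direction.
        g = np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (kl(p, predict(x + xi * d + e))
                    - kl(p, predict(x + xi * d - e))) / (2 * h)
        r_adv = eps * g / (np.linalg.norm(g) + 1e-12)
        # The regulariser: local distributional smoothness at x.
        return kl(p, predict(x + r_adv))

    x = rng.standard_normal(5)
    print("virtual adversarial loss at x:", vat_regularizer(x))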

Online structured Laplace approximations for overcoming catastrophic forgetting

H Ritter, A Botev, D Barber - Advances in Neural …, 2018 - proceedings.neurips.cc
We introduce the Kronecker factored online Laplace approximation for overcoming
catastrophic forgetting in neural networks. The method is grounded in a Bayesian online …
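
A rough, hedged sketch of the general recipe with a diagonal rather than Kronecker-factored curvature, so it is deliberately simpler than the paper's method: fit task A, build a Laplace (Gaussian) approximation around the task-A solution, and carry the resulting quadratic penalty into training on task B. The toy quadratic losses and the precision bookkeeping are assumptions.

    import numpy as np

    # Toy per-task losses (assumptions): task t has quadratic loss
    # L_t(theta) = 0.5 * sum(c_t * (theta - m_t)**2), so curvatures are exact.
    m_a, c_a = np.array([1.0, 0.0]), np.array([4.0, 1.0])   # task A optimum, curvature
    m_b, c_b = np.array([0.0, 2.0]), np.array([1.0, 3.0])   # task B optimum, curvature

    def loss_a(theta): return 0.5 * np.sum(c_a * (theta - m_a) ** 2)
    def loss_b(theta): return 0.5 * np.sum(c_b * (theta - m_b) ** 2)

    # After task A, the Laplace approximation is a Gaussian centred at the
    # task-A optimum whose precision adds the loss curvature to the prior's.
    theta_a = m_a.copy()
    prior_prec = np.ones(2)               # isotropic prior precision (assumption)
    posterior_prec = prior_prec + c_a

    # Training on task B then adds a quadratic penalty that protects the
    # directions task A cared about.
    def penalized_loss_b(theta):
        return loss_b(theta) + 0.5 * np.sum(posterior_prec * (theta - theta_a) ** 2)

    # With quadratic losses the penalised optimum is available in closed form.
    theta_b = (posterior_prec * theta_a + c_b * m_b) / (posterior_prec + c_b)
    print("task-B solution with Laplace penalty:", theta_b,
          "penalised loss:", penalized_loss_b(theta_b))
    print("forgetting on task A:", loss_a(theta_b),
          "vs. without the penalty:", loss_a(m_b))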

Streaming variational Bayes

T Broderick, N Boyd, A Wibisono… - Advances in neural …, 2013 - proceedings.neurips.cc
We present SDA-Bayes, a framework for (S)treaming, (D)istributed, (A)synchronous
computation of a Bayesian posterior. The framework makes streaming updates to the …
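
A hedged, fully conjugate toy version of the streaming idea (not the SDA-Bayes framework itself, which targets approximate posteriors and asynchronous workers): with a Beta-Bernoulli model the posterior after each minibatch can serve exactly as the prior for the next, so streaming updates reduce to accumulating counts.

    import numpy as np

    rng = np.random.default_rng(0)
    stream = (rng.random(10_000) < 0.3).astype(int)    # simulated Bernoulli stream

    # Beta(alpha, beta) prior on the Bernoulli rate. With conjugacy, each
    # streaming update folds a minibatch's counts into the current posterior,
    # which then acts as the prior for the next minibatch.
    alpha, beta = 1.0, 1.0
    for start in range(0, stream.size, 500):           # minibatches of 500
        batch = stream[start:start + 500]
        alpha += batch.sum()
        beta += batch.size - batch.sum()

    # Identical to the batch posterior on all 10,000 points; posterior mean ~ 0.3.
    print("streaming posterior mean:", alpha / (alpha + beta))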

Optimal continual learning has perfect memory and is NP-hard

J Knoblauch, H Husain… - … Conference on Machine …, 2020 - proceedings.mlr.press
Continual Learning (CL) algorithms incrementally learn a predictor or representation across
multiple sequentially observed tasks. Designing CL algorithms that perform reliably and …

Measuring and regularizing networks in function space

AS Benjamin, D Rolnick, K Kording - arXiv preprint arXiv:1805.08289, 2018 - arxiv.org
To optimize a neural network one often thinks of optimizing its parameters, but it is ultimately
a matter of optimizing the function that maps inputs to outputs. Since a change in the …
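
A hedged sketch of the measurement the abstract alludes to (not the authors' code): estimate a function-space distance between two parameter settings by averaging squared output differences over probe inputs, and contrast it with the parameter-space distance. The tiny tanh model and the probe distribution are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def f(theta, x):
        # Toy "network" (assumption): a linear map followed by tanh.
        return np.tanh(x @ theta)

    theta_old = rng.standard_normal(4)
    theta_new = theta_old + 0.1 * rng.standard_normal(4)   # a small parameter step

    # Function-space distance: average squared output difference over probe
    # inputs, rather than any distance between the parameter vectors.
    probes = rng.standard_normal((1024, 4))
    func_dist = np.mean((f(theta_old, probes) - f(theta_new, probes)) ** 2)
    param_dist = np.sum((theta_old - theta_new) ** 2)

    print("function-space distance:", func_dist)
    print("parameter-space distance:", param_dist)

    # Used as a regulariser in continual learning, func_dist (computed on stored
    # or replayed inputs) is added to the new task's loss so that the function,
    # not the parameters, is kept from drifting.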

Continual learning with bayesian neural networks for non-stationary data

R Kurle, B Cseke, A Klushyn… - International …, 2019 - openreview.net
This work addresses continual learning for non-stationary data, using Bayesian neural
networks and memory-based online variational Bayes. We represent the posterior …
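
A minimal sketch of one ingredient, online variational Bayes with the old posterior acting as the new prior, rather than the paper's memory-based method: with diagonal Gaussian posteriors the KL regulariser has a closed form. The Gaussian parameters below are illustrative numbers only.

    import numpy as np

    def kl_diag_gauss(mu_q, var_q, mu_p, var_p):
        # KL( N(mu_q, diag var_q) || N(mu_p, diag var_p) ), closed form.
        return 0.5 * np.sum(np.log(var_p / var_q)
                            + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

    # Diagonal-Gaussian posterior over weights after the data seen so far
    # (illustrative numbers).
    mu_old, var_old = np.array([0.5, -1.0]), np.array([0.20, 0.10])

    # A new batch is fit with an objective of the form
    #   E_q[negative log-likelihood of the batch] + KL(q || old posterior),
    # i.e. the old posterior plays the role of the prior.
    mu_new, var_new = np.array([0.7, -0.9]), np.array([0.15, 0.12])
    print("KL(new || old) regulariser:",
          kl_diag_gauss(mu_new, var_new, mu_old, var_old))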

Continual learning via sequential function-space variational inference

TGJ Rudner, FB Smith, Q Feng… - … on Machine Learning, 2022 - proceedings.mlr.press
Sequential Bayesian inference over predictive functions is a natural framework for continual
learning from streams of data. However, applying it to neural networks has proved …

Incremental local Gaussian regression

F Meier, P Hennig, S Schaal - Advances in Neural …, 2014 - proceedings.neurips.cc
Locally weighted regression (LWR) was created as a nonparametric method that can
approximate a wide range of functions, is computationally efficient, and can learn continually …
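
For context on the baseline being extended, here is a hedged sketch of plain locally weighted regression on toy 1-D data, not the incremental local Gaussian regression algorithm the paper proposes; the kernel bandwidth, local linear model, and data are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.linspace(-3, 3, 200)
    y = np.sin(X) + 0.1 * rng.standard_normal(X.size)   # noisy toy 1-D data

    def lwr_predict(x_query, X, y, bandwidth=0.5):
        # Gaussian kernel weights centred on the query point.
        w = np.exp(-0.5 * ((X - x_query) / bandwidth) ** 2)
        # Weighted least squares for a local linear model y = b0 + b1 * x.
        A = np.column_stack([np.ones_like(X), X])
        beta = np.linalg.solve(A.T @ (w[:, None] * A), A.T @ (w * y))
        return beta[0] + beta[1] * x_query

    print("prediction at x = 1.0:", lwr_predict(1.0, X, y),
          " target sin(1.0):", np.sin(1.0))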

Detecting and adapting to irregular distribution shifts in Bayesian online learning

A Li, A Boyd, P Smyth, S Mandt - Advances in neural …, 2021 - proceedings.neurips.cc
We consider the problem of online learning in the presence of distribution shifts that occur at
an unknown rate and of unknown intensity. We derive a new Bayesian online inference …
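
A heavily simplified, hedged sketch of the detect-and-adapt idea rather than the paper's inference algorithm: for each batch of Bernoulli data, compare the marginal likelihood under the accumulated Beta posterior against a reset-to-prior hypothesis and reset when the shift hypothesis wins; the conjugate model, hazard probability, and threshold are assumptions.

    import math
    import numpy as np

    rng = np.random.default_rng(0)

    def log_marg_lik(k, n, a, b):
        # Log marginal likelihood of a Bernoulli batch (k ones out of n) under Beta(a, b).
        return (math.lgamma(a + k) + math.lgamma(b + n - k) - math.lgamma(a + b + n)
                - (math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)))

    prior_a, prior_b = 1.0, 1.0
    a, b = prior_a, prior_b
    hazard = 0.05                          # assumed prior probability of a shift per batch

    for t in range(40):
        rate = 0.2 if t < 20 else 0.8      # abrupt shift halfway through the stream
        batch = (rng.random(50) < rate).astype(int)
        k, n = int(batch.sum()), batch.size

        # Compare: batch explained by the current posterior vs. by a fresh prior (shift).
        log_keep = math.log(1 - hazard) + log_marg_lik(k, n, a, b)
        log_shift = math.log(hazard) + log_marg_lik(k, n, prior_a, prior_b)
        p_shift = 1.0 / (1.0 + math.exp(log_keep - log_shift))

        if p_shift > 0.5:                  # adapt by discarding the stale posterior
            a, b = prior_a, prior_b
            print(f"batch {t}: shift detected (p = {p_shift:.2f}), resetting")
        a += k
        b += n - k

    print("final posterior mean:", a / (a + b))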