Embracing change: Continual learning in deep neural networks

R Hadsell, D Rao, AA Rusu, R Pascanu - Trends in Cognitive Sciences, 2020 - cell.com
Artificial intelligence research has seen enormous progress over the past few decades, but it
predominantly relies on fixed datasets and stationary environments. Continual learning is an …

A survey on data‐efficient algorithms in big data era

A Adadi - Journal of Big Data, 2021 - Springer
The leading approaches in Machine Learning are notoriously data-hungry. Unfortunately,
many application domains do not have access to big data because acquiring data involves a …

Dualnet: Continual learning, fast and slow

Q Pham, C Liu, S Hoi - Advances in Neural Information …, 2021 - proceedings.neurips.cc
According to Complementary Learning Systems (CLS) theory (McClelland et al., 1995) in neuroscience, humans do effective continual learning …
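
DualNet operationalizes this fast/slow division with a slow learner that builds general representations and a fast learner that adapts them to the supervised task. A minimal sketch of such a decomposition, assuming PyTorch; the feature-consistency objective for the slow learner below is a simplified stand-in for the paper's self-supervised loss, and all dimensions and names are illustrative:

```python
# Hedged sketch of a fast/slow dual-learner in the spirit of DualNet
# (Pham et al., 2021). The paper uses a self-supervised loss for the
# slow learner; the feature-consistency loss below is a simplified
# stand-in, and every name and dimension here is illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

slow = nn.Sequential(nn.Flatten(), nn.Linear(784, 256), nn.ReLU())  # representation learner
fast = nn.Linear(256, 10)                                           # task learner

slow_opt = torch.optim.SGD(slow.parameters(), lr=1e-2)
fast_opt = torch.optim.SGD(fast.parameters(), lr=1e-1)

def step(x, y):
    # Slow learner: representation objective -- consistency between two
    # noisy views of the same input (a simplification of the paper's
    # self-supervised objective).
    z1 = slow(x + 0.1 * torch.randn_like(x))
    z2 = slow(x + 0.1 * torch.randn_like(x))
    slow_loss = F.mse_loss(z1, z2.detach())
    slow_opt.zero_grad(); slow_loss.backward(); slow_opt.step()

    # Fast learner: supervised loss on top of the slow features, which
    # are detached so this step only updates the fast head.
    logits = fast(slow(x).detach())
    fast_loss = F.cross_entropy(logits, y)
    fast_opt.zero_grad(); fast_loss.backward(); fast_opt.step()
    return slow_loss.item(), fast_loss.item()
```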

New insights on reducing abrupt representation change in online continual learning

L Caccia, R Aljundi, N Asadi, T Tuytelaars… - arXiv preprint arXiv …, 2021 - arxiv.org
In the online continual learning paradigm, agents must learn from a changing distribution
while respecting memory and compute constraints. Experience Replay (ER), where a small …
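
The snippet cuts off mid-definition: in ER, a small subset of past data is stored in a buffer and replayed alongside each incoming batch. A minimal sketch, assuming PyTorch and a reservoir-sampling buffer policy (a common choice, not necessarily this paper's):

```python
# Minimal Experience Replay sketch. Reservoir sampling keeps the buffer
# an approximately unbiased sample of the stream; whether this paper
# uses that exact policy is an assumption.
import random
import torch
import torch.nn.functional as F

class ReservoirBuffer:
    def __init__(self, capacity):
        self.capacity, self.seen, self.data = capacity, 0, []

    def add(self, x, y):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            i = random.randrange(self.seen)
            if i < self.capacity:       # replace a stored example
                self.data[i] = (x, y)

    def sample(self, k):
        batch = random.sample(self.data, min(k, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)

def er_step(model, opt, buffer, x, y, replay_k=32):
    for xi, yi in zip(x, y):            # store the incoming examples
        buffer.add(xi, yi)
    # Concatenate the incoming batch with a replayed batch, then take
    # one ordinary SGD step on the combined loss.
    rx, ry = buffer.sample(replay_k)
    bx, by = torch.cat([x, rx]), torch.cat([y, ry])
    loss = F.cross_entropy(model(bx), by)
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```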

Orthogonal gradient descent for continual learning

M Farajtabar, N Azizan, A Mott… - … Conference on Artificial …, 2020 - proceedings.mlr.press
Neural networks are achieving state-of-the-art and sometimes super-human performance on
learning tasks across a variety of domains. Whenever these problems require learning in a …
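
The method itself stores gradient directions from earlier tasks and projects each new task's gradient onto their orthogonal complement, so updates leave old-task outputs approximately unchanged. A sketch of that projection in flattened-parameter form; which gradients get stored, and how many, is simplified here:

```python
# Sketch of the Orthogonal Gradient Descent projection (Farajtabar et
# al., 2020). We keep an orthonormal basis of gradient directions from
# earlier tasks and remove the new gradient's component along each
# basis vector before stepping. Flattened-vector form for clarity.
import torch

basis = []  # orthonormal directions accumulated from previous tasks

def add_direction(g):
    # Gram-Schmidt: orthogonalize g against the current basis and keep
    # it if a nonzero component remains.
    v = g.clone()
    for b in basis:
        v -= (v @ b) * b
    if v.norm() > 1e-8:
        basis.append(v / v.norm())

def project(g):
    # Project g onto the orthogonal complement of span(basis).
    for b in basis:
        g = g - (g @ b) * b
    return g

# Usage inside a training step (parameters flattened to one vector):
# g = torch.cat([p.grad.reshape(-1) for p in model.parameters()])
# g = project(g)   # then unflatten g back into the parameters and step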

Probing representation forgetting in supervised and unsupervised continual learning

MR Davari, N Asadi, S Mudur… - Proceedings of the …, 2022 - openaccess.thecvf.com
Continual Learning (CL) research typically focuses on tackling the phenomenon of
catastrophic forgetting in neural networks. Catastrophic forgetting is associated with an …
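
Measuring forgetting at the representation level, rather than through the task head, is typically done with a linear probe: freeze the encoder, fit only a linear classifier on its features, and compare probe accuracy before and after training on new tasks. A hedged sketch, assuming PyTorch; `encoder`, the probe setup, and the single-loader shortcut are illustrative:

```python
# Sketch of representation probing: freeze the encoder, fit only a
# linear classifier on its features, and use probe accuracy as a proxy
# for how much task-relevant information the representation retains.
# All names are illustrative; in practice the probe would be fit on a
# training split and evaluated on a held-out split.
import torch
import torch.nn as nn
import torch.nn.functional as F

def probe_accuracy(encoder, loader, feat_dim, n_classes, epochs=5):
    probe = nn.Linear(feat_dim, n_classes)
    opt = torch.optim.SGD(probe.parameters(), lr=0.1)
    encoder.eval()
    for _ in range(epochs):
        for x, y in loader:
            with torch.no_grad():
                z = encoder(x)          # frozen features
            loss = F.cross_entropy(probe(z), y)
            opt.zero_grad(); loss.backward(); opt.step()
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            correct += (probe(encoder(x)).argmax(1) == y).sum().item()
            total += y.numel()
    return correct / total  # compare before vs. after new-task training
```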

Understanding the role of training regimes in continual learning

SI Mirzadeh, M Farajtabar, R Pascanu… - Advances in …, 2020 - proceedings.neurips.cc
Catastrophic forgetting affects the training of neural networks, limiting their ability to learn
multiple tasks sequentially. From the perspective of the well-established plasticity-stability …

ROSE: robust online self-adjusting ensemble for continual learning on imbalanced drifting data streams

A Cano, B Krawczyk - Machine Learning, 2022 - Springer
Data streams are potentially unbounded sequences of instances arriving over time to a
classifier. Designing algorithms that are capable of dealing with massive, rapidly arriving …
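
Stream classifiers of this kind are conventionally evaluated prequentially: each arriving instance is first used to test the model, then to update it. A minimal test-then-train loop, assuming scikit-learn's `partial_fit` interface as the incremental learner; ROSE's ensemble and drift-adaptation machinery are not reproduced here:

```python
# Minimal prequential (test-then-train) loop for a data stream, the
# standard protocol for evaluating stream classifiers like ROSE. Any
# incrementally updatable classifier works; here we assume
# scikit-learn's partial_fit convention.
import numpy as np
from sklearn.linear_model import SGDClassifier

def prequential(stream, classes):
    clf = SGDClassifier(loss="log_loss")
    correct = seen = 0
    first = True
    for x, y in stream:                 # instances arrive one at a time
        x = x.reshape(1, -1)
        if not first:                   # test on the instance first ...
            correct += int(clf.predict(x)[0] == y)
            seen += 1
        # ... then train on it (classes must be given on the first call)
        clf.partial_fit(x, [y], classes=classes if first else None)
        first = False
    return correct / max(seen, 1)       # prequential accuracy

# Toy stream whose input distribution shifts halfway through.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (500, 5)), rng.normal(2, 1, (500, 5))])
y = np.concatenate([np.zeros(500, int), np.ones(500, int)])
print(prequential(zip(X, y), classes=[0, 1]))
```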

Continual learning via local module composition

O Ostapenko, P Rodriguez… - Advances in Neural …, 2021 - proceedings.neurips.cc
Modularity is a compelling solution to continual learning (CL), the problem of modeling
sequences of related tasks. Learning and then composing modules to solve different tasks …
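
One simple realization: each layer keeps a pool of modules and mixes their outputs with learned, layer-local weights ("local" in the sense that each layer decides its own composition rather than following a global controller). A heavily simplified sketch assuming PyTorch; the paper's structural learning and module growth are omitted:

```python
# Heavily simplified modular-composition sketch: each layer holds a
# pool of small modules, and a learned softmax weighting mixes their
# outputs. The actual method (Ostapenko et al., 2021) makes these
# decisions via local structural learning and can grow the pool; this
# only shows the compositional forward pass.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModularLayer(nn.Module):
    def __init__(self, in_dim, out_dim, n_modules=4):
        super().__init__()
        self.pool = nn.ModuleList(
            nn.Linear(in_dim, out_dim) for _ in range(n_modules))
        self.logits = nn.Parameter(torch.zeros(n_modules))  # local mixture

    def forward(self, x):
        w = F.softmax(self.logits, dim=0)
        outs = torch.stack([m(x) for m in self.pool])   # (M, B, out)
        return torch.einsum("m,mbo->bo", w, outs)       # weighted mix

net = nn.Sequential(ModularLayer(784, 128), nn.ReLU(), ModularLayer(128, 10))
x = torch.randn(32, 784)
print(net(x).shape)  # torch.Size([32, 10])
```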

Continual learning for recurrent neural networks: an empirical evaluation

A Cossu, A Carta, V Lomonaco, D Bacciu - Neural Networks, 2021 - Elsevier
Learning continuously throughout a model's lifetime is fundamental to deploying machine learning
solutions that are robust to drifts in the data distribution. Advances in Continual Learning (CL) with …