Embracing change: Continual learning in deep neural networks
Artificial intelligence research has seen enormous progress over the past few decades, but it
predominantly relies on fixed datasets and stationary environments. Continual learning is an …
A survey on data‐efficient algorithms in big data era
A Adadi - Journal of Big Data, 2021 - Springer
The leading approaches in Machine Learning are notoriously data-hungry. Unfortunately,
many application domains do not have access to big data because acquiring data involves a …
Dualnet: Continual learning, fast and slow
According to Complementary Learning Systems (CLS) theory (McClelland et al., 1995) in neuroscience, humans do effective continual learning …
New insights on reducing abrupt representation change in online continual learning
In the online continual learning paradigm, agents must learn from a changing distribution
while respecting memory and compute constraints. Experience Replay (ER), where a small …
Orthogonal gradient descent for continual learning
M Farajtabar, N Azizan, A Mott… - … Conference on Artificial …, 2020 - proceedings.mlr.press
Neural networks are achieving state of the art and sometimes super-human performance on
learning tasks across a variety of domains. Whenever these problems require learning in a …
Probing representation forgetting in supervised and unsupervised continual learning
Continual Learning (CL) research typically focuses on tackling the phenomenon of
catastrophic forgetting in neural networks. Catastrophic forgetting is associated with an …
Understanding the role of training regimes in continual learning
Catastrophic forgetting affects the training of neural networks, limiting their ability to learn
multiple tasks sequentially. From the perspective of the well established plasticity-stability …
ROSE: robust online self-adjusting ensemble for continual learning on imbalanced drifting data streams
A Cano, B Krawczyk - Machine Learning, 2022 - Springer
Data streams are potentially unbounded sequences of instances arriving over time to a
classifier. Designing algorithms that are capable of dealing with massive, rapidly arriving …
Continual learning via local module composition
O Ostapenko, P Rodriguez… - Advances in Neural …, 2021 - proceedings.neurips.cc
Modularity is a compelling solution to continual learning (CL), the problem of modeling
sequences of related tasks. Learning and then composing modules to solve different tasks …
Continual learning for recurrent neural networks: an empirical evaluation
Learning continuously throughout a model's lifetime is fundamental to deploying machine learning solutions robust to drifts in the data distribution. Advances in Continual Learning (CL) with …