Continual learning with tiny episodic memories

A Chaudhry, M Rohrbach, M Elhoseiny… - Workshop on Multi …, 2019 - ora.ox.ac.uk
In continual learning (CL), an agent learns from a stream of tasks leveraging prior
experience to transfer knowledge to future tasks. It is an ideal framework to decrease the …

On tiny episodic memories in continual learning

A Chaudhry, M Rohrbach, M Elhoseiny… - arXiv preprint arXiv …, 2019 - arxiv.org
In continual learning (CL), an agent learns from a stream of tasks leveraging prior
experience to transfer knowledge to future tasks. It is an ideal framework to decrease the …

Efficient continual learning with modular networks and task-driven priors

T Veniat, L Denoyer, MA Ranzato - arXiv preprint arXiv:2012.12631, 2020 - arxiv.org
Existing literature in Continual Learning (CL) has focused on overcoming catastrophic
forgetting, the inability of the learner to recall how to perform tasks observed in the past …

Beyond not-forgetting: Continual learning with backward knowledge transfer

S Lin, L Yang, D Fan, J Zhang - Advances in Neural …, 2022 - proceedings.neurips.cc
By learning a sequence of tasks continually, an agent in continual learning (CL) can improve
the learning performance of both a new task and 'old' tasks by leveraging the forward …

Learning to learn without forgetting by maximizing transfer and minimizing interference

M Riemer, I Cases, R Ajemian, M Liu, I Rish… - arXiv preprint arXiv …, 2018 - arxiv.org
Lack of performance when it comes to continual learning over non-stationary distributions of
data remains a major challenge in scaling neural network learning to more human realistic …

Online continual learning under extreme memory constraints

E Fini, S Lathuiliere, E Sangineto, M Nabi… - Computer Vision–ECCV …, 2020 - Springer
Continual Learning (CL) aims to develop agents emulating the human ability to sequentially
learn new tasks while being able to retain knowledge obtained from past experiences. In this …

Dualprompt: Complementary prompting for rehearsal-free continual learning

Z Wang, Z Zhang, S Ebrahimi, R Sun, H Zhang… - … on Computer Vision, 2022 - Springer
Continual learning aims to enable a single model to learn a sequence of tasks without
catastrophic forgetting. Top-performing methods usually require a rehearsal buffer to store …

Memory-efficient semi-supervised continual learning: The world is its own replay buffer

J Smith, J Balloch, YC Hsu, Z Kira - 2021 International Joint …, 2021 - ieeexplore.ieee.org
Rehearsal is a critical component for class-incremental continual learning, yet it requires a
substantial memory budget. Our work investigates whether we can significantly reduce this …

Online fast adaptation and knowledge accumulation: a new approach to continual learning

M Caccia, P Rodriguez, O Ostapenko… - arXiv preprint arXiv …, 2020 - arxiv.org
Continual learning studies agents that learn from streams of tasks without forgetting previous
ones while adapting to new ones. Two recent continual-learning scenarios have opened …

Gradient-based editing of memory examples for online task-free continual learning

X Jin, A Sadhu, J Du, X Ren - Advances in Neural …, 2021 - proceedings.neurips.cc
We explore task-free continual learning (CL), in which a model is trained to avoid
catastrophic forgetting in the absence of explicit task boundaries or identities. Among many …