A comprehensive survey of continual learning: theory, method and application
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …
RanPAC: Random projections and pre-trained models for continual learning
MD McDonnell, D Gong, A Parvaneh… - Advances in …, 2024 - proceedings.neurips.cc
Continual learning (CL) aims to incrementally learn different tasks (such as classification) in
a non-stationary data stream without forgetting old ones. Most CL works focus on tackling …
FeCAM: Exploiting the heterogeneity of class distributions in exemplar-free continual learning
Exemplar-free class-incremental learning (CIL) poses several challenges since it prohibits
the rehearsal of data from previous tasks and thus suffers from catastrophic forgetting …
Online continual learning without the storage constraint
Traditional online continual learning (OCL) research has primarily focused on mitigating
catastrophic forgetting with fixed and limited storage allocation throughout an agent's …
Continual learning with pre-trained models: A survey
Nowadays, real-world applications often face streaming data, which requires the learning
system to absorb new knowledge as data evolves. Continual Learning (CL) aims to achieve …
Guiding the last layer in federated learning with pre-trained models
G Legate, N Bernier, L Page-Caccia… - Advances in …, 2024 - proceedings.neurips.cc
Federated Learning (FL) is an emerging paradigm that allows a model to be trained across a
number of participants without sharing data. Recent works have begun to consider the …
Overcoming Generic Knowledge Loss with Selective Parameter Update
Foundation models encompass an extensive knowledge base and offer remarkable
transferability. However, this knowledge becomes outdated or insufficient over time. The …
An analysis of initial training strategies for exemplar-free class-incremental learning
Class-Incremental Learning (CIL) aims to build classification models from data
streams. At each step of the CIL process, new classes must be integrated into the model …
Rapid Adaptation in Online Continual Learning: Are We Evaluating It Right?
We revisit the common practice of evaluating adaptation of Online Continual Learning (OCL)
algorithms through the metric of online accuracy, which measures the accuracy of the model …
Continually learning representations at scale
Many widely used continual learning benchmarks follow a protocol that starts from an
untrained, randomly initialized model that needs to sequentially learn a number of incoming …