A comprehensive survey of continual learning: theory, method and application

L Wang, X Zhang, H Su, J Zhu - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …

RanPAC: Random projections and pre-trained models for continual learning

MD McDonnell, D Gong, A Parvaneh… - Advances in …, 2024 - proceedings.neurips.cc
Continual learning (CL) aims to incrementally learn different tasks (such as classification) in
a non-stationary data stream without forgetting old ones. Most CL works focus on tackling …
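
As the title indicates, the approach combines a frozen pre-trained feature extractor with a fixed random projection. A minimal sketch of that general recipe is below; the dimensions, the ReLU nonlinearity, the ridge parameter, and the class/method names are illustrative assumptions rather than the paper's exact configuration.

    import numpy as np

    class RandomProjectionClassifier:
        """Sketch: fixed random nonlinear projection of frozen pre-trained
        features, with a ridge-regression head rebuilt from running
        statistics so earlier tasks need no stored exemplars."""

        def __init__(self, d_feat=768, d_proj=2000, n_classes=100, ridge=1e-3, seed=0):
            rng = np.random.default_rng(seed)
            self.W_rand = rng.standard_normal((d_feat, d_proj))  # frozen random projection
            self.G = np.zeros((d_proj, d_proj))      # running Gram matrix of projected features
            self.C = np.zeros((d_proj, n_classes))   # running class-wise feature sums
            self.ridge = ridge
            self.n_classes = n_classes

        def _project(self, feats):
            return np.maximum(feats @ self.W_rand, 0.0)   # random projection + ReLU

        def update(self, feats, labels):
            """Accumulate statistics for one incremental task."""
            H = self._project(feats)
            Y = np.eye(self.n_classes)[labels]
            self.G += H.T @ H
            self.C += H.T @ Y

        def predict(self, feats):
            W = np.linalg.solve(self.G + self.ridge * np.eye(self.G.shape[0]), self.C)
            return (self._project(feats) @ W).argmax(axis=1)

Because the classifier is recovered in closed form from accumulated statistics, adding a new task only updates G and C, which is what makes this style of method attractive for non-stationary streams.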

FeCAM: Exploiting the heterogeneity of class distributions in exemplar-free continual learning

D Goswami, Y Liu, B Twardowski… - Advances in Neural …, 2024 - proceedings.neurips.cc
Exemplar-free class-incremental learning (CIL) poses several challenges since it prohibits
the rehearsal of data from previous tasks and thus suffers from catastrophic forgetting …
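
The title points to modelling each class with its own feature distribution instead of replaying stored samples. A minimal sketch of that idea is below, using per-class means and covariances from a frozen backbone and a Mahalanobis-distance rule; the shrinkage value and the class/method names are assumptions for illustration, not the paper's exact procedure.

    import numpy as np

    class ClassDistributionClassifier:
        """Sketch: keep only per-class feature means and covariances from a
        frozen backbone and classify by regularized Mahalanobis distance,
        so no exemplars from earlier tasks are stored."""

        def __init__(self, shrinkage=1e-2):
            self.stats = {}            # class id -> (mean, inverse covariance)
            self.shrinkage = shrinkage

        def add_class(self, class_id, feats):
            """Fit one class from its features when its task arrives."""
            mu = feats.mean(axis=0)
            cov = np.cov(feats, rowvar=False)
            cov += self.shrinkage * np.eye(cov.shape[0])   # shrink for numerical stability
            self.stats[class_id] = (mu, np.linalg.inv(cov))

        def predict(self, feats):
            classes = sorted(self.stats)
            dists = []
            for c in classes:
                mu, inv_cov = self.stats[c]
                diff = feats - mu
                # squared Mahalanobis distance of each sample to class c
                dists.append(np.einsum('ij,jk,ik->i', diff, inv_cov, diff))
            return np.array(classes)[np.argmin(np.stack(dists, axis=1), axis=1)]

Keeping a separate covariance per class is what lets such a classifier respect the heterogeneity of class distributions that the title refers to, rather than assuming one shared metric for all classes.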

Online continual learning without the storage constraint

A Prabhu, Z Cai, P Dokania, P Torr, V Koltun… - arXiv preprint arXiv …, 2023 - arxiv.org
Traditional online continual learning (OCL) research has primarily focused on mitigating
catastrophic forgetting with fixed and limited storage allocation throughout an agent's …

Continual learning with pre-trained models: A survey

DW Zhou, HL Sun, J Ning, HJ Ye, DC Zhan - arXiv preprint arXiv …, 2024 - arxiv.org
Nowadays, real-world applications often face streaming data, which requires the learning
system to absorb new knowledge as data evolves. Continual Learning (CL) aims to achieve …

Guiding the last layer in federated learning with pre-trained models

G Legate, N Bernier, L Page-Caccia… - Advances in …, 2024 - proceedings.neurips.cc
Federated Learning (FL) is an emerging paradigm that allows a model to be trained across a
number of participants without sharing data. Recent works have begun to consider the …

Overcoming Generic Knowledge Loss with Selective Parameter Update

W Zhang, P Janson, R Aljundi… - Proceedings of the …, 2024 - openaccess.thecvf.com
Foundation models encompass an extensive knowledge base and offer remarkable
transferability. However, this knowledge becomes outdated or insufficient over time. The …

An analysis of initial training strategies for exemplar-free class-incremental learning

G Petit, M Soumm, E Feillet… - Proceedings of the …, 2024 - openaccess.thecvf.com
Class-Incremental Learning (CIL) aims to build classification models from data
streams. At each step of the CIL process, new classes must be integrated into the model …

Rapid Adaptation in Online Continual Learning: Are We Evaluating It Right?

HAAK Hammoud, A Prabhu, SN Lim… - 2023 IEEE/CVF …, 2023 - ieeexplore.ieee.org
We revisit the common practice of evaluating adaptation of Online Continual Learning (OCL)
algorithms through the metric of online accuracy, which measures the accuracy of the model …

Continually learning representations at scale

A Galashov, J Mitrovic, D Tirumala… - Conference on …, 2023 - proceedings.mlr.press
Many widely used continual learning benchmarks follow a protocol that starts from an
untrained, randomly initialized model that needs to sequentially learn a number of incoming …