A comprehensive survey of continual learning: theory, method and application

L Wang, X Zhang, H Su, J Zhu - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …

Clad: A realistic continual learning benchmark for autonomous driving

E Verwimp, K Yang, S Parisot, L Hong, S McDonagh… - Neural Networks, 2023 - Elsevier
In this paper, we describe the design and the ideas motivating a new Continual Learning
benchmark for Autonomous Driving (CLAD), which focuses on the problems of object …

Foster: Feature boosting and compression for class-incremental learning

FY Wang, DW Zhou, HJ Ye, DC Zhan - European conference on computer …, 2022 - Springer
The ability to learn new concepts continually is necessary in this ever-changing world.
However, deep neural networks suffer from catastrophic forgetting when learning new …

Deep class-incremental learning: A survey

DW Zhou, QW Wang, ZH Qi, HJ Ye, DC Zhan… - arXiv preprint arXiv …, 2023 - arxiv.org
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results
in many vision tasks in the closed world. However, novel classes emerge from time to time in …

S-prompts learning with pre-trained transformers: An Occam's razor for domain incremental learning

Y Wang, Z Huang, X Hong - Advances in Neural …, 2022 - proceedings.neurips.cc
State-of-the-art deep neural networks are still struggling to address the catastrophic
forgetting problem in continual learning. In this paper, we propose a simple paradigm …

Coda-prompt: Continual decomposed attention-based prompting for rehearsal-free continual learning

JS Smith, L Karlinsky, V Gutta… - Proceedings of the …, 2023 - openaccess.thecvf.com
Computer vision models suffer from a phenomenon known as catastrophic forgetting when
learning novel concepts from continuously shifting training data. Typical solutions for this …

A model or 603 exemplars: Towards memory-efficient class-incremental learning

DW Zhou, QW Wang, HJ Ye, DC Zhan - arXiv preprint arXiv:2205.13218, 2022 - arxiv.org
Real-world applications require the classification model to adapt to new classes without
forgetting old ones. Correspondingly, Class-Incremental Learning (CIL) aims to train a …

Preventing zero-shot transfer degradation in continual learning of vision-language models

Z Zheng, M Ma, K Wang, Z Qin… - Proceedings of the …, 2023 - openaccess.thecvf.com
Continual learning (CL) can help pre-trained vision-language models efficiently adapt to
new or under-trained data distributions without re-training. Nevertheless, during the …

Fine-tuned language models are continual learners

T Scialom, T Chakrabarty, S Muresan - arXiv preprint arXiv:2205.12393, 2022 - arxiv.org
Recent work on large language models relies on the intuition that most natural language
processing tasks can be described via natural language instructions. Language models …

Heterogeneous forgetting compensation for class-incremental learning

J Dong, W Liang, Y Cong… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Class-incremental learning (CIL) has achieved remarkable successes in learning new
classes consecutively while overcoming catastrophic forgetting on old categories. However …