Deep class-incremental learning: A survey

DW Zhou, QW Wang, ZH Qi, HJ Ye, DC Zhan… - arXiv preprint arXiv …, 2023 - arxiv.org
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results
in many vision tasks in the closed world. However, novel classes emerge from time to time in …

A comprehensive survey of forgetting in deep learning beyond continual learning

Z Wang, E Yang, L Shen, H Huang - arXiv preprint arXiv:2307.09218, 2023 - arxiv.org
Forgetting refers to the loss or deterioration of previously acquired information or knowledge.
While the existing surveys on forgetting have primarily focused on continual learning …

A collective AI via lifelong learning and sharing at the edge

A Soltoggio, E Ben-Iwhiwhu, V Braverman… - Nature Machine …, 2024 - nature.com
One vision of a future artificial intelligence (AI) is one in which many separate units can learn
independently over a lifetime and share their knowledge with each other. The synergy …

Class-incremental learning: A survey

DW Zhou, QW Wang, ZH Qi, HJ Ye… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results
in many vision tasks in the closed world. However, novel classes emerge from time to time in …

Learning without forgetting for vision-language models

DW Zhou, Y Zhang, J Ning, HJ Ye, DC Zhan… - arXiv preprint arXiv …, 2023 - arxiv.org
Class-Incremental Learning (CIL), or continual learning, is a capability desired in the real
world: it requires a learning system to adapt to new tasks without forgetting former ones …

Continual learning: Applications and the road forward

E Verwimp, S Ben-David, M Bethge, A Cossu… - arXiv preprint arXiv …, 2023 - arxiv.org
Continual learning is a sub-field of machine learning that aims to allow machine learning
models to learn continuously from new data, accumulating knowledge without forgetting …

Pilot: A pre-trained model-based continual learning toolbox

HL Sun, DW Zhou, HJ Ye, DC Zhan - arXiv preprint arXiv:2309.07117, 2023 - arxiv.org
While traditional machine learning can effectively tackle a wide range of problems, it
primarily operates within a closed-world setting, which presents limitations when dealing …

Continual learning of large language models: A comprehensive survey

H Shi, Z Xu, H Wang, W Qin, W Wang, Y Wang… - arXiv preprint arXiv …, 2024 - arxiv.org
The recent success of large language models (LLMs) trained on static, pre-collected,
general datasets has sparked numerous research directions and applications. One such …

Overcoming Generic Knowledge Loss with Selective Parameter Update

W Zhang, P Janson, R Aljundi… - Proceedings of the …, 2024 - openaccess.thecvf.com
Foundation models encompass an extensive knowledge base and offer remarkable
transferability. However, this knowledge becomes outdated or insufficient over time. The …

Cost-effective on-device continual learning over memory hierarchy with Miro

X Ma, S Jeong, M Zhang, D Wang, J Choi… - Proceedings of the 29th …, 2023 - dl.acm.org
Continual learning (CL) trains neural network (NN) models incrementally from a continuous stream of tasks.
To remember previously learned knowledge, prior studies store old samples over a memory …