A comprehensive survey of continual learning: theory, method and application

L Wang, X Zhang, H Su, J Zhu - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …

Embracing change: Continual learning in deep neural networks

R Hadsell, D Rao, AA Rusu, R Pascanu - Trends in cognitive sciences, 2020 - cell.com
Artificial intelligence research has seen enormous progress over the past few decades, but it
predominantly relies on fixed datasets and stationary environments. Continual learning is an …

Deep class-incremental learning: A survey

DW Zhou, QW Wang, ZH Qi, HJ Ye, DC Zhan… - arXiv preprint arXiv …, 2023 - arxiv.org
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results
in many vision tasks in the closed world. However, novel classes emerge from time to time in …

Class-incremental learning by knowledge distillation with adaptive feature consolidation

M Kang, J Park, B Han - … of the IEEE/CVF conference on …, 2022 - openaccess.thecvf.com
We present a novel class-incremental learning approach based on deep neural networks,
which continually learns new tasks with limited memory for storing examples in the previous …
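The knowledge-distillation component that the title refers to can be illustrated with a minimal sketch. This shows only the generic temperature-scaled distillation objective; the function names, temperature value, and loss form are illustrative assumptions, not the paper's adaptive feature consolidation method.

```python
import math

def softmax(logits, temperature=2.0):
    """Temperature-scaled softmax; a higher temperature softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between softened teacher and student outputs --
    the standard knowledge-distillation objective."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

The loss is zero when the student's output distribution matches the teacher's and grows as they diverge, which is what penalizes forgetting of old-task behavior.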

CAFE: Learning to condense dataset by aligning features

K Wang, B Zhao, X Peng, Z Zhu… - Proceedings of the …, 2022 - openaccess.thecvf.com
Dataset condensation aims to reduce network training effort by condensing a
cumbersome training set into a compact synthetic one. State-of-the-art approaches largely …

Robust test-time adaptation in dynamic scenarios

L Yuan, B Xie, S Li - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Test-time adaptation (TTA) aims to adapt a pretrained model to test distributions using
only unlabeled test data streams. Most previous TTA methods have achieved great …

Co2L: Contrastive continual learning

H Cha, J Lee, J Shin - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
Recent breakthroughs in self-supervised learning show that such algorithms learn visual
representations that transfer better to unseen tasks than cross-entropy-based …

Dataset condensation with distribution matching

B Zhao, H Bilen - Proceedings of the IEEE/CVF Winter …, 2023 - openaccess.thecvf.com
The computational cost of training state-of-the-art deep models in many learning problems is
rapidly increasing due to more sophisticated models and larger datasets. A recent promising …
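The distribution-matching idea in the title above can be sketched as minimizing the distance between summary statistics of real and synthetic data. The sketch below matches only feature means; the function names and the choice of squared-L2 distance are illustrative assumptions, not the paper's exact objective.

```python
def feature_mean(features):
    """Per-dimension mean of a list of feature vectors."""
    n = len(features)
    dim = len(features[0])
    return [sum(f[d] for f in features) / n for d in range(dim)]

def distribution_matching_loss(real_feats, syn_feats):
    """Squared L2 distance between the mean embeddings of the real and
    synthetic sets -- the matching objective in its simplest form."""
    mu_real = feature_mean(real_feats)
    mu_syn = feature_mean(syn_feats)
    return sum((a - b) ** 2 for a, b in zip(mu_real, mu_syn))
```

In practice the features would come from randomly sampled embedding networks and the synthetic set would be updated by gradient descent on this loss; the sketch shows only the matching criterion itself.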

Class-incremental learning: survey and performance evaluation on image classification

M Masana, X Liu, B Twardowski… - … on Pattern Analysis …, 2022 - ieeexplore.ieee.org
For future learning systems, incremental learning is desirable because it allows for: efficient
resource usage by eliminating the need to retrain from scratch at the arrival of new data; …

Dataset distillation using neural feature regression

Y Zhou, E Nezhadarya, J Ba - Advances in Neural …, 2022 - proceedings.neurips.cc
Dataset distillation aims to learn a small synthetic dataset that preserves most of the
information from the original dataset. Dataset distillation can be formulated as a bi-level …
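The bi-level formulation mentioned in the snippet is commonly written as follows, where $\mathcal{S}$ is the learned synthetic set and $\mathcal{T}$ the original training data (the notation here is the generic convention for bi-level dataset distillation, not necessarily this paper's exact objective):

```latex
\min_{\mathcal{S}} \; \mathcal{L}\bigl(\theta^{*}(\mathcal{S}); \mathcal{T}\bigr)
\quad \text{s.t.} \quad
\theta^{*}(\mathcal{S}) = \arg\min_{\theta} \, \mathcal{L}\bigl(\theta; \mathcal{S}\bigr)
```

The inner problem trains model parameters $\theta$ on the small synthetic set; the outer problem adjusts the synthetic set so that the resulting model performs well on the full original data.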