A comprehensive survey of continual learning: theory, method and application
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …
Embracing change: Continual learning in deep neural networks
Artificial intelligence research has seen enormous progress over the past few decades, but it
predominantly relies on fixed datasets and stationary environments. Continual learning is an …
Deep class-incremental learning: A survey
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results
in many vision tasks in the closed world. However, novel classes emerge from time to time in …
Class-incremental learning by knowledge distillation with adaptive feature consolidation
We present a novel class-incremental learning approach based on deep neural networks,
which continually learns new tasks with limited memory for storing examples in the previous …
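The distillation component named in the title can be illustrated with the standard temperature-scaled knowledge-distillation loss, where the model trained on previous classes acts as the teacher (a generic sketch; the paper's adaptive feature consolidation additionally weights a feature-space regularizer, which is not shown here):

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) between temperature-softened predictions,
    scaled by T^2 so gradient magnitudes stay comparable across temperatures."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T ** 2) * (p * (np.log(p) - np.log(q))).sum(axis=-1).mean()

# In class-incremental training, the old model's logits on new data serve as
# the teacher signal, discouraging the student's outputs on old classes from
# drifting while it learns the new task. (Hypothetical example logits.)
old = np.array([[2.0, 0.5, -1.0]])   # old-model logits
new = np.array([[1.8, 0.7, -0.9]])   # current-model logits on the same input
print(distillation_loss(new, old))   # small penalty: predictions barely drift
```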
CAFE: Learning to condense dataset by aligning features
Dataset condensation aims at reducing the network training effort through condensing a
cumbersome training set into a compact synthetic one. State-of-the-art approaches largely …
Robust test-time adaptation in dynamic scenarios
Test-time adaptation (TTA) intends to adapt the pretrained model to test distributions with
only unlabeled test data streams. Most of the previous TTA methods have achieved great …
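The basic TTA recipe the abstract refers to — adapting a pretrained model online from unlabeled test batches — is often implemented as entropy minimization over a few lightweight parameters (a Tent-style sketch with a hand-derived gradient; the paper builds robustness mechanisms for dynamic streams on top of such a baseline):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def mean_entropy(p):
    return -(p * np.log(p + 1e-12)).sum(axis=1).mean()

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 3))               # frozen "pretrained" classifier
gamma, beta = np.ones(3), np.zeros(3)     # only these adapt at test time
X_test = rng.normal(size=(64, 8))         # unlabeled test batch

H0 = mean_entropy(softmax(X_test @ W * gamma + beta))

lr = 0.1
for _ in range(50):
    raw = X_test @ W
    p = softmax(raw * gamma + beta)
    H_row = -(p * np.log(p + 1e-12)).sum(axis=1, keepdims=True)
    # Per-row gradient of entropy w.r.t. logits: -p * (log p + H_row)
    dz = -p * (np.log(p + 1e-12) + H_row)
    gamma -= lr * (dz * raw).mean(axis=0)
    beta -= lr * dz.mean(axis=0)

H1 = mean_entropy(softmax(X_test @ W * gamma + beta))
print(H0, H1)   # prediction entropy drops after adaptation
```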
Co2L: Contrastive continual learning
Recent breakthroughs in self-supervised learning show that such algorithms learn visual
representations that can be transferred better to unseen tasks than cross-entropy based …
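The contrastive objective underlying this line of work is typically an InfoNCE / NT-Xent loss between two views of the same batch (a generic sketch; Co2L's asymmetric supervised variant and its self-distillation term are not reproduced here):

```python
import numpy as np

def info_nce(z1, z2, tau=0.1):
    """InfoNCE loss between two L2-normalized views of a batch: the i-th row
    of each view forms a positive pair; all other rows act as negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                      # (n, n) cosine similarities
    logits = sim - sim.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.diag(log_prob).mean()           # cross-entropy on the diagonal

rng = np.random.default_rng(1)
z = rng.normal(size=(32, 16))
# Aligned views give a low loss; unrelated views give a loss near log(n).
print(info_nce(z, z), info_nce(z, rng.normal(size=(32, 16))))
```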
Dataset condensation with distribution matching
Computational cost of training state-of-the-art deep models in many learning problems is
rapidly increasing due to more sophisticated models and larger datasets. A recent promising …
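The promising approach the abstract alludes to — distribution matching — learns a small synthetic set whose embedded statistics match those of the real data. A minimal sketch, matching only mean embeddings under a fixed random linear map (an assumption; the paper uses randomly initialized networks as embeddings):

```python
import numpy as np

rng = np.random.default_rng(0)

real = rng.normal(loc=2.0, scale=1.0, size=(500, 10))   # "real" training set
syn = rng.normal(size=(10, 10))                          # 10 learnable synthetic points

# Fixed random linear embedding standing in for a feature extractor.
W = rng.normal(size=(10, 32)) / np.sqrt(10)

n, lr = syn.shape[0], 0.5
for _ in range(500):
    # Loss: squared distance between mean embeddings of real and synthetic data.
    d = (real.mean(0) - syn.mean(0)) @ W
    grad_row = -(2.0 / n) * (d @ W.T)   # d loss / d (each synthetic point)
    syn -= lr * grad_row                # same gradient broadcast to all rows

print(np.linalg.norm((real.mean(0) - syn.mean(0)) @ W))  # ≈ 0 after matching
```

Training a downstream model on `syn` instead of `real` is then far cheaper, at the cost of whatever information the matched statistics fail to capture.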
Class-incremental learning: survey and performance evaluation on image classification
For future learning systems, incremental learning is desirable because it allows for: efficient
resource usage by eliminating the need to retrain from scratch at the arrival of new data; …
Dataset distillation using neural feature regression
Dataset distillation aims to learn a small synthetic dataset that preserves most of the
information from the original dataset. Dataset distillation can be formulated as a bi-level …
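The bi-level formulation the abstract mentions can be made concrete on a toy linear-regression task (a sketch under simplifying assumptions: a single inner gradient step from a fixed initialization, synthetic targets held fixed, and a hand-derived outer gradient rather than the paper's neural-feature-regression solver):

```python
import numpy as np

rng = np.random.default_rng(0)

# Real dataset for a linear task y = x @ theta_true + noise.
theta_true = rng.normal(size=5)
Xr = rng.normal(size=(200, 5))
yr = Xr @ theta_true + 0.01 * rng.normal(size=200)

m = 10
Xs = rng.normal(size=(m, 5))   # learnable synthetic inputs (the distilled data)
ys = rng.normal(size=m)        # synthetic targets, held fixed for simplicity

alpha, beta = 0.1, 0.5         # inner and outer learning rates
theta0 = np.zeros(5)           # fixed inner-loop initialization

def outer_loss(Xs):
    # Inner level: one gradient step of least squares on the synthetic set.
    r = Xs @ theta0 - ys
    theta1 = theta0 - alpha * (2.0 / m) * (Xs.T @ r)
    # Outer level: how well the inner solution fits the real data.
    return np.mean((Xr @ theta1 - yr) ** 2)

loss0 = outer_loss(Xs)
for _ in range(300):
    r = Xs @ theta0 - ys
    theta1 = theta0 - alpha * (2.0 / m) * (Xs.T @ r)
    g = (2.0 / Xr.shape[0]) * Xr.T @ (Xr @ theta1 - yr)   # dL/dtheta1
    # Hand-derived gradient of the outer loss w.r.t. the synthetic inputs.
    dXs = -(2.0 * alpha / m) * (np.outer(r, g) + np.outer(Xs @ g, theta0))
    Xs -= beta * dXs

print(loss0, outer_loss(Xs))   # outer loss drops as Xs is distilled
```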