DDGR: Continual learning with deep diffusion-based generative replay

R Gao, W Liu - International Conference on Machine …, 2023 - proceedings.mlr.press
Popular deep-learning models in the field of image classification suffer from catastrophic
forgetting—models will forget previously acquired skills when learning new ones …

Generating instance-level prompts for rehearsal-free continual learning

D Jung, D Han, J Bang, H Song - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
We introduce Domain-Adaptive Prompt (DAP), a novel method for continual
learning using Vision Transformers (ViT). Prompt-based continual learning has recently …

Generative replay with feedback connections as a general strategy for continual learning

GM Van de Ven, AS Tolias - arXiv preprint arXiv:1809.10635, 2018 - arxiv.org
A major obstacle to developing artificial intelligence applications capable of true lifelong
learning is that artificial neural networks quickly or catastrophically forget previously learned …
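
For readers unfamiliar with the generative-replay strategy referenced in this and the DDGR entry above, the following is a minimal, illustrative sketch in PyTorch: a generator trained on earlier tasks produces synthetic samples that are pseudo-labeled by a frozen copy of the previous classifier and mixed with current-task data. All module names, shapes, and the replay ratio are assumptions for illustration, not the setup of the cited papers, and the generator training step itself is omitted.

```python
# Minimal generative-replay sketch (illustrative only; not the cited papers' method).
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

LATENT, FEAT, CLASSES = 16, 32, 10

classifier = nn.Sequential(nn.Linear(FEAT, 64), nn.ReLU(), nn.Linear(64, CLASSES))
generator = nn.Sequential(nn.Linear(LATENT, 64), nn.ReLU(), nn.Linear(64, FEAT))
opt = torch.optim.Adam(classifier.parameters(), lr=1e-3)

def train_on_task(x_new, y_new, old_generator=None, old_classifier=None, replay_ratio=0.5):
    """One step on a task: mix real current-task data with samples replayed
    from a generator fit on earlier tasks, pseudo-labeled by the old classifier."""
    if old_generator is not None:
        n_replay = int(replay_ratio * len(x_new))
        with torch.no_grad():
            x_rep = old_generator(torch.randn(n_replay, LATENT))  # synthetic "old" data
            y_rep = old_classifier(x_rep).argmax(dim=1)           # pseudo-labels
        x_new = torch.cat([x_new, x_rep])
        y_new = torch.cat([y_new, y_rep])
    opt.zero_grad()
    loss = F.cross_entropy(classifier(x_new), y_new)
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage with random stand-in data for two "tasks".
x1, y1 = torch.randn(64, FEAT), torch.randint(0, 5, (64,))
train_on_task(x1, y1)                                         # task 1: no replay yet
old_gen = copy.deepcopy(generator)                            # snapshot before task 2
old_clf = copy.deepcopy(classifier)                           # (generator training omitted here)
x2, y2 = torch.randn(64, FEAT), torch.randint(5, 10, (64,))
train_on_task(x2, y2, old_generator=old_gen, old_classifier=old_clf)
```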

Learning to prompt for continual learning

Z Wang, Z Zhang, CY Lee, H Zhang… - Proceedings of the …, 2022 - openaccess.thecvf.com
The mainstream paradigm behind continual learning has been to adapt the model
parameters to non-stationary data distributions, where catastrophic forgetting is the central …

Supervised contrastive replay: Revisiting the nearest class mean classifier in online class-incremental continual learning

Z Mai, R Li, H Kim, S Sanner - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
Online class-incremental continual learning (CL) studies the problem of learning new
classes continually from an online non-stationary data stream, intending to adapt to new …
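
The nearest class mean (NCM) classifier revisited in this entry can be summarized by a short, generic sketch: per-class means of normalized embeddings are computed from a replay buffer, and a query is assigned to the class with the closest mean. The feature dimensions, buffer handling, and function names below are assumptions for illustration, not the paper's implementation.

```python
# Minimal nearest class mean (NCM) classifier over embeddings (illustrative only).
import torch
import torch.nn.functional as F

def compute_class_means(buffer_feats, buffer_labels, num_classes):
    """Per-class mean of L2-normalized embeddings stored in a replay buffer."""
    feats = F.normalize(buffer_feats, dim=1)
    means = torch.stack([feats[buffer_labels == c].mean(dim=0) for c in range(num_classes)])
    return F.normalize(means, dim=1)

def ncm_predict(features, class_means):
    """Assign each feature vector to the class whose mean is closest (Euclidean)."""
    dists = torch.cdist(features, class_means)   # (N, num_classes)
    return dists.argmin(dim=1)

# Toy usage with random stand-in embeddings.
buf_x, buf_y = torch.randn(100, 32), torch.randint(0, 5, (100,))
means = compute_class_means(buf_x, buf_y, num_classes=5)
preds = ncm_predict(F.normalize(torch.randn(8, 32), dim=1), means)
```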

FeCAM: Exploiting the heterogeneity of class distributions in exemplar-free continual learning

D Goswami, Y Liu, B Twardowski… - Advances in Neural …, 2024 - proceedings.neurips.cc
Exemplar-free class-incremental learning (CIL) poses several challenges since it prohibits
the rehearsal of data from previous tasks and thus suffers from catastrophic forgetting …

Multi-criteria selection of rehearsal samples for continual learning

C Zhuang, S Huang, G Cheng, J Ning - Pattern Recognition, 2022 - Elsevier
Retaining a small subset to replay is a direct and effective way to prevent catastrophic
forgetting in continual learning. However, due to data complexity and restricted memory …

Rehearsal revealed: The limits and merits of revisiting samples in continual learning

E Verwimp, M De Lange… - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com
Learning from non-stationary data streams and overcoming catastrophic forgetting still
poses a serious challenge for machine learning research. Rather than aiming to improve …

Generative feature replay for class-incremental learning

X Liu, C Wu, M Menta, L Herranz… - Proceedings of the …, 2020 - openaccess.thecvf.com
Humans are capable of learning new tasks without forgetting previous ones, while neural
networks fail due to catastrophic forgetting between new and previously-learned tasks. We …

Looking through the past: better knowledge retention for generative replay in continual learning

V Khan, S Cygert, B Twardowski… - Proceedings of the …, 2023 - openaccess.thecvf.com
In this work, we improve the generative replay in a continual learning setting. We notice that
in VAE-based generative replay, the generated features are quite far from the original ones …