DDGR: Continual learning with deep diffusion-based generative replay
R Gao, W Liu - International Conference on Machine …, 2023 - proceedings.mlr.press
Popular deep-learning models in the field of image classification suffer from catastrophic
forgetting—models will forget previously acquired skills when learning new ones …
Generating instance-level prompts for rehearsal-free continual learning
Abstract We introduce Domain-Adaptive Prompt (DAP), a novel method for continual
learning using Vision Transformers (ViT). Prompt-based continual learning has recently …
Generative replay with feedback connections as a general strategy for continual learning
GM Van de Ven, AS Tolias - arXiv preprint arXiv:1809.10635, 2018 - arxiv.org
A major obstacle to developing artificial intelligence applications capable of true lifelong
learning is that artificial neural networks quickly or catastrophically forget previously learned …
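The generative-replay strategy described in this entry can be summarized in one loop: a generator trained on earlier tasks produces pseudo-samples, the previous model labels them, and the mixture of replayed and current-task data is used for the next update. Below is a minimal sketch of that mixing step; the `generator` and `old_model` lambdas are hypothetical stand-ins, not the paper's actual components.

```python
import numpy as np

rng = np.random.default_rng(0)

def replay_batch(generator, old_model, n):
    """Draw pseudo-data from the previous generator and label it with
    the previous model, as in generative replay's scholar setup."""
    x_replay = generator(n)         # samples mimicking earlier tasks
    y_replay = old_model(x_replay)  # labels produced by the old model
    return x_replay, y_replay

# hypothetical stand-ins for a trained generator and old classifier
generator = lambda n: rng.normal(size=(n, 4))
old_model = lambda x: (x.sum(axis=1) > 0).astype(int)

x_new = rng.normal(loc=2.0, size=(8, 4))  # current-task data
y_new = np.ones(8, dtype=int)

x_r, y_r = replay_batch(generator, old_model, 8)
x_mix = np.concatenate([x_new, x_r])      # interleave old and new
y_mix = np.concatenate([y_new, y_r])
print(x_mix.shape, y_mix.shape)  # (16, 4) (16,)
```

Training on `x_mix`/`y_mix` rather than `x_new` alone is what lets the new model retain the old tasks without storing any real past data.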
Learning to prompt for continual learning
The mainstream paradigm behind continual learning has been to adapt the model
parameters to non-stationary data distributions, where catastrophic forgetting is the central …
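Prompt-based methods such as the one in this entry keep the backbone frozen and instead maintain a pool of learnable prompts, selecting the most relevant ones per input via a key-query match. The sketch below illustrates that selection step under simplified assumptions; the pool sizes and the cosine-similarity scoring are illustrative choices, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

# prompt pool: M prompts, each paired with a key used for selection
M, key_dim, prompt_len, embed_dim = 10, 16, 5, 16
keys = rng.normal(size=(M, key_dim))
prompts = rng.normal(size=(M, prompt_len, embed_dim))

def select_prompts(query, top_k=3):
    # cosine similarity between the input's query feature and each key
    sims = keys @ query
    sims /= np.linalg.norm(keys, axis=1) * np.linalg.norm(query) + 1e-8
    idx = np.argsort(-sims)[:top_k]
    # the chosen prompts would be prepended to the frozen ViT's tokens
    return np.concatenate(prompts[idx], axis=0)

q = rng.normal(size=key_dim)  # e.g. a feature of the input image
sel = select_prompts(q)
print(sel.shape)  # (15, 16)
```

Only the prompts and keys receive gradients during training, which is why such methods are rehearsal-free: knowledge of past tasks lives in the pool rather than in stored samples.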
Supervised contrastive replay: Revisiting the nearest class mean classifier in online class-incremental continual learning
Online class-incremental continual learning (CL) studies the problem of learning new
classes continually from an online non-stationary data stream, intending to adapt to new …
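The nearest class mean (NCM) classifier revisited in this entry is simple enough to show directly: each class is represented by the mean of its feature embeddings, and a test feature is assigned to the closest mean. A minimal sketch, with toy 2-D features standing in for learned embeddings:

```python
import numpy as np

def ncm_predict(features, class_means):
    # Nearest Class Mean: assign each feature vector to the class
    # whose stored mean embedding is closest in Euclidean distance.
    dists = np.linalg.norm(
        features[:, None, :] - class_means[None, :, :], axis=2
    )
    return dists.argmin(axis=1)

# toy example: two class means in a 2-D feature space
means = np.array([[0.0, 0.0], [10.0, 10.0]])
x = np.array([[1.0, -1.0], [9.0, 11.0]])
print(ncm_predict(x, means))  # [0 1]
```

Because class means can be updated incrementally as new samples arrive, NCM avoids the biased softmax head that plagues many online class-incremental learners.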
FeCAM: Exploiting the heterogeneity of class distributions in exemplar-free continual learning
Exemplar-free class-incremental learning (CIL) poses several challenges since it prohibits
the rehearsal of data from previous tasks and thus suffers from catastrophic forgetting …
Multi-criteria selection of rehearsal samples for continual learning
Retaining a small subset to replay is a direct and effective way to prevent catastrophic
forgetting in continual learning. However, due to data complexity and restricted memory …
Rehearsal revealed: The limits and merits of revisiting samples in continual learning
E Verwimp, M De Lange… - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com
Learning from non-stationary data streams and overcoming catastrophic forgetting still
poses a serious challenge for machine learning research. Rather than aiming to improve …
Generative feature replay for class-incremental learning
Humans are capable of learning new tasks without forgetting previous ones, while neural
networks fail due to catastrophic forgetting between new and previously-learned tasks. We …
Looking through the past: better knowledge retention for generative replay in continual learning
In this work, we improve the generative replay in a continual learning setting. We notice that
in VAE-based generative replay, the generated features are quite far from the original ones …
Related searches
- continual learning generative replay
- continual learning class distributions
- feedback connections generative replay
- knowledge retention generative replay
- continual learning mean classifier
- continual learning feedback connections
- continual learning instance level
- continual learning knowledge retention
- continual learning nearest class