Generating instance-level prompts for rehearsal-free continual learning
We introduce Domain-Adaptive Prompt (DAP), a novel method for continual
learning using Vision Transformers (ViT). Prompt-based continual learning has recently …
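The snippet breaks off before the method, but the title's idea of instance-level prompts can be sketched: rather than picking prompts from a shared pool, a small generator network emits a prompt conditioned on each input's features. A minimal PyTorch sketch under that assumption; PromptGenerator, feat_dim, and prompt_len are illustrative names, not the paper's.

```python
import torch
import torch.nn as nn

class PromptGenerator(nn.Module):
    """Hypothetical instance-level prompt generator (illustrative, not the
    paper's exact architecture): maps a pooled image feature to prompt tokens."""
    def __init__(self, feat_dim=768, prompt_len=5):
        super().__init__()
        self.prompt_len = prompt_len
        self.net = nn.Sequential(
            nn.Linear(feat_dim, feat_dim),
            nn.GELU(),
            nn.Linear(feat_dim, prompt_len * feat_dim),
        )

    def forward(self, pooled_feat):            # (B, feat_dim)
        p = self.net(pooled_feat)              # (B, prompt_len * feat_dim)
        return p.view(-1, self.prompt_len, pooled_feat.size(-1))

# Usage: prepend the generated prompts to the patch tokens of a frozen ViT.
gen = PromptGenerator()
feats = torch.randn(4, 768)                    # pooled features from a frozen encoder
prompts = gen(feats)                           # (4, 5, 768), one prompt set per instance
```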
Learning to prompt for continual learning
The mainstream paradigm behind continual learning has been to adapt the model
parameters to non-stationary data distributions, where catastrophic forgetting is the central …
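L2P, the paper behind this entry, instead keeps the pre-trained ViT frozen and learns a pool of prompts with learnable keys: the frozen model's [CLS] feature acts as a query, the most similar keys are selected, and their prompts are prepended to the input tokens. A minimal sketch of that query-key selection; pool_size, top_n, and select_prompts are assumed names.

```python
import torch
import torch.nn.functional as F

pool_size, prompt_len, dim, top_n = 10, 5, 768, 3
prompt_pool = torch.nn.Parameter(torch.randn(pool_size, prompt_len, dim))
prompt_keys = torch.nn.Parameter(torch.randn(pool_size, dim))

def select_prompts(query):                       # query: (B, dim), e.g. frozen [CLS] feature
    sim = F.cosine_similarity(                   # (B, pool_size) query-key similarity
        query.unsqueeze(1), prompt_keys.unsqueeze(0), dim=-1)
    idx = sim.topk(top_n, dim=1).indices         # (B, top_n) most similar prompts
    selected = prompt_pool[idx]                  # (B, top_n, prompt_len, dim)
    return selected.flatten(1, 2), sim           # tokens to prepend + sims for the pull loss

query = torch.randn(4, dim)
prompts, sim = select_prompts(query)             # prepend `prompts` to the patch tokens
```

During training, the similarities of the selected keys also feed a matching loss that pulls those keys toward their queries.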
Introducing language guidance in prompt-based continual learning
Continual Learning aims to learn a single model on a sequence of tasks without having
access to data from previous tasks. The biggest challenge in the domain still remains …
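The cut-off snippet does not say how the language guidance works, so the sketch below shows only the generic idea: align visual features (or prompt keys) with frozen text-encoder embeddings of class names via a cosine objective. A hedged illustration, not necessarily the paper's exact formulation; text_embed and the loss form are assumptions.

```python
import torch
import torch.nn.functional as F

def language_guidance_loss(visual_feat, class_names, text_embed):
    """Pull each visual feature toward the text embedding of its class name.
    `text_embed` maps name -> precomputed (dim,) embedding from a frozen text
    encoder (e.g. CLIP's); both sides are L2-normalized."""
    targets = torch.stack([text_embed[n] for n in class_names])   # (B, dim)
    v = F.normalize(visual_feat, dim=-1)
    t = F.normalize(targets, dim=-1)
    return (1.0 - (v * t).sum(-1)).mean()        # mean (1 - cosine similarity)

dim = 512
text_embed = {"cat": torch.randn(dim), "dog": torch.randn(dim)}
loss = language_guidance_loss(torch.randn(2, dim), ["cat", "dog"], text_embed)
```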
Coda-prompt: Continual decomposed attention-based prompting for rehearsal-free continual learning
JS Smith, L Karlinsky, V Gutta… - Proceedings of the …, 2023 - openaccess.thecvf.com
Computer vision models suffer from a phenomenon known as catastrophic forgetting when
learning novel concepts from continuously shifting training data. Typical solutions for this …
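CODA-Prompt's decomposition makes prompting end-to-end differentiable: instead of a hard top-N lookup, the prompt fed to the model is a weighted sum of learned components, with weights from the similarity between an (attended) query feature and per-component keys. A sketch of that weighted combination; shapes and variable names are illustrative.

```python
import torch
import torch.nn.functional as F

M, prompt_len, dim = 8, 5, 768
components = torch.nn.Parameter(torch.randn(M, prompt_len, dim))  # prompt components P_m
keys = torch.nn.Parameter(torch.randn(M, dim))                    # keys K_m
attn = torch.nn.Parameter(torch.randn(M, dim))                    # attention vectors A_m

def coda_prompt(query):                          # query: (B, dim) from the frozen encoder
    q = query.unsqueeze(1) * attn.unsqueeze(0)   # (B, M, dim) attended query
    alpha = F.cosine_similarity(q, keys.unsqueeze(0), dim=-1)  # (B, M) weights
    return torch.einsum('bm,mld->bld', alpha, components)      # (B, prompt_len, dim)

prompts = coda_prompt(torch.randn(4, dim))       # differentiable w.r.t. all three parts
```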
Dualprompt: Complementary prompting for rehearsal-free continual learning
Continual learning aims to enable a single model to learn a sequence of tasks without
catastrophic forgetting. Top-performing methods usually require a rehearsal buffer to store …
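DualPrompt's complementary prompting attaches two kinds of prompts to a frozen ViT: a G-Prompt shared by all tasks at shallow layers (task-invariant knowledge) and per-task E-Prompts at deeper layers, selected at test time by key matching as in L2P. A schematic sketch of the attachment; the layer indices are assumptions, and the paper actually injects prompts via prefix-tuning rather than the plain prepending shown here.

```python
import torch

n_tasks, g_len, e_len, dim = 5, 5, 20, 768
g_prompt = torch.nn.Parameter(torch.randn(g_len, dim))            # shared across tasks
e_prompts = torch.nn.Parameter(torch.randn(n_tasks, e_len, dim))  # one per task
e_keys = torch.nn.Parameter(torch.randn(n_tasks, dim))            # for test-time matching

G_LAYERS, E_LAYERS = {0, 1}, {2, 3, 4}           # illustrative attachment depths

def tokens_for_layer(layer_idx, x, task_id):
    """Prepend the appropriate prompt(s) before the given transformer block."""
    B = x.size(0)
    if layer_idx in G_LAYERS:
        x = torch.cat([g_prompt.expand(B, -1, -1), x], dim=1)
    if layer_idx in E_LAYERS:
        x = torch.cat([e_prompts[task_id].expand(B, -1, -1), x], dim=1)
    return x

x = torch.randn(4, 196, dim)                     # patch tokens
x0 = tokens_for_layer(0, x, task_id=2)           # G-Prompt attached at layer 0
```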
A closer look at rehearsal-free continual learning
Continual learning is a setting where machine learning models learn novel concepts from
continuously shifting training data, while simultaneously avoiding degradation of knowledge …
Fecam: Exploiting the heterogeneity of class distributions in exemplar-free continual learning
Exemplar-free class-incremental learning (CIL) poses several challenges since it prohibits
the rehearsal of data from previous tasks and thus suffers from catastrophic forgetting …
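FeCAM's point is the heterogeneity of class distributions: a single shared metric over frozen features treats all classes alike, so it instead classifies by Mahalanobis distance under per-class covariances (stabilized by shrinkage). A minimal numpy sketch of such a classifier; the shrinkage constant is an assumption.

```python
import numpy as np

def fit_class_stats(feats, labels, shrink=1.0):
    """Per-class mean and shrunk inverse covariance from frozen features."""
    stats = {}
    for c in np.unique(labels):
        f = feats[labels == c]
        mu = f.mean(axis=0)
        cov = np.cov(f, rowvar=False) + shrink * np.eye(f.shape[1])
        stats[c] = (mu, np.linalg.inv(cov))
    return stats

def predict(feats, stats):
    """Assign each feature to the class with the smallest Mahalanobis distance."""
    classes = sorted(stats)
    d = np.stack([
        np.einsum('nd,dk,nk->n', feats - stats[c][0], stats[c][1], feats - stats[c][0])
        for c in classes
    ], axis=1)                                   # (N, n_classes) squared distances
    return np.array(classes)[d.argmin(axis=1)]

feats = np.random.randn(100, 16); labels = np.random.randint(0, 3, 100)
stats = fit_class_stats(feats, labels)
preds = predict(np.random.randn(5, 16), stats)
```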
A unified continual learning framework with general parameter-efficient tuning
The" pre-training-downstream adaptation" presents both new opportunities and challenges
for Continual Learning (CL). Although the recent state-of-the-art in CL is achieved through …
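This framework slots in arbitrary parameter-efficient tuning (PET) modules, prompts, adapters, LoRA, and the like, and pairs an online module trained on the current task with an offline copy accumulated by exponential moving average, ensembling the two at inference. Below is a sketch of only the EMA accumulation step; the adapter architecture and rate are placeholders.

```python
import torch

@torch.no_grad()
def ema_accumulate(online, offline, rate=0.999):
    """Push the slowly-updated offline PET module toward the online one:
    offline <- rate * offline + (1 - rate) * online, parameter-wise."""
    for p_off, p_on in zip(offline.parameters(), online.parameters()):
        p_off.mul_(rate).add_(p_on, alpha=1.0 - rate)

# Example with a tiny adapter as the PET module (placeholder architecture).
adapter_online = torch.nn.Sequential(torch.nn.Linear(768, 64), torch.nn.ReLU(),
                                     torch.nn.Linear(64, 768))
adapter_offline = torch.nn.Sequential(torch.nn.Linear(768, 64), torch.nn.ReLU(),
                                      torch.nn.Linear(64, 768))
adapter_offline.load_state_dict(adapter_online.state_dict())  # start in sync

ema_accumulate(adapter_online, adapter_offline)  # call after each optimizer step
```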
CLR: Channel-wise lightweight reprogramming for continual learning
Continual learning aims to emulate the human ability to continually accumulate knowledge
over sequential tasks. The main challenge is to maintain performance on previously learned …
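CLR keeps the pre-trained backbone frozen and adds, per task, cheap channel-wise reprogramming layers after the convolutional blocks, so each new task re-purposes shared features at a small parameter cost. A sketch of wrapping a frozen conv with per-task channel-wise (depthwise) layers; the exact placement and kernel size in the paper are not reproduced here.

```python
import torch
import torch.nn as nn

class ReprogrammedConv(nn.Module):
    """Frozen conv followed by a small per-task channel-wise (depthwise)
    reprogramming layer; only the reprogramming layers are trained per task."""
    def __init__(self, frozen_conv, n_tasks):
        super().__init__()
        self.conv = frozen_conv
        for p in self.conv.parameters():
            p.requires_grad = False              # backbone stays fixed
        c = frozen_conv.out_channels
        self.reprogram = nn.ModuleList(
            nn.Conv2d(c, c, kernel_size=3, padding=1, groups=c)  # depthwise
            for _ in range(n_tasks))

    def forward(self, x, task_id):
        return self.reprogram[task_id](self.conv(x))

layer = ReprogrammedConv(nn.Conv2d(3, 16, 3, padding=1), n_tasks=4)
out = layer(torch.randn(2, 3, 32, 32), task_id=0)   # (2, 16, 32, 32)
```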
Plasticity-optimized complementary networks for unsupervised continual learning
A Gomez-Villa, B Twardowski… - Proceedings of the …, 2024 - openaccess.thecvf.com
Continuous unsupervised representation learning (CURL) research has greatly benefited
from improvements in self-supervised learning (SSL) techniques. As a result, existing CURL …