ORDisCo: Effective and efficient usage of incremental unlabeled data for semi-supervised continual learning
Continual learning usually assumes the incoming data are fully labeled, which might not be
applicable in real applications. In this work, we consider semi-supervised continual learning …
Representational continuity for unsupervised continual learning
Continual learning (CL) aims to learn a sequence of tasks without forgetting the previously
acquired knowledge. However, recent CL advances are restricted to supervised continual …
Memory-efficient semi-supervised continual learning: The world is its own replay buffer
Rehearsal is a critical component for class-incremental continual learning, yet it requires a
substantial memory budget. Our work investigates whether we can significantly reduce this …
Continual semi-supervised learning through contrastive interpolation consistency
Continual Learning (CL) investigates how to train Deep Networks on a stream of tasks
without incurring forgetting. CL settings proposed in the literature assume that every incoming …
Plasticity-optimized complementary networks for unsupervised continual learning
A Gomez-Villa, B Twardowski… - Proceedings of the …, 2024 - openaccess.thecvf.com
Continuous unsupervised representation learning (CURL) research has greatly benefited
from improvements in self-supervised learning (SSL) techniques. As a result, existing CURL …
Continual learning based on OOD detection and task masking
Existing continual learning techniques focus on either the task incremental learning (TIL) or
class incremental learning (CIL) problem, but not both. CIL and TIL differ mainly in that the …
Real-time evaluation in online continual learning: A new hope
Current evaluations of Continual Learning (CL) methods typically assume that there
is no constraint on training time and computation. This is an unrealistic assumption for any …
Online continual learning on a contaminated data stream with blurry task boundaries
Learning under a continuously changing data distribution with incorrect labels is a desirable
yet challenging real-world problem. A large body of continual learning (CL) methods …
Learning to prompt for continual learning
The mainstream paradigm behind continual learning has been to adapt the model
parameters to non-stationary data distributions, where catastrophic forgetting is the central …
SLCA: Slow learner with classifier alignment for continual learning on a pre-trained model
The goal of continual learning is to improve the performance of recognition models in
learning sequentially arrived data. Although most existing works are established on the …