Memory-efficient semi-supervised continual learning: The world is its own replay buffer
Rehearsal is a critical component for class-incremental continual learning, yet it requires a
substantial memory budget. Our work investigates whether we can significantly reduce this …
CUCL: Codebook for Unsupervised Continual Learning
The focus of this study is on Unsupervised Continual Learning (UCL), as it presents an
alternative to Supervised Continual Learning, which requires high-quality, manually labeled data …
Representational continuity for unsupervised continual learning
Continual learning (CL) aims to learn a sequence of tasks without forgetting the previously
acquired knowledge. However, recent CL advances are restricted to supervised continual …
ORDisCo: Effective and efficient usage of incremental unlabeled data for semi-supervised continual learning
Continual learning usually assumes the incoming data are fully labeled, which might not be
applicable in real applications. In this work, we consider semi-supervised continual learning …
On tiny episodic memories in continual learning
In continual learning (CL), an agent learns from a stream of tasks leveraging prior
experience to transfer knowledge to future tasks. It is an ideal framework to decrease the …
Regularizing second-order influences for continual learning
Continual learning aims to learn on non-stationary data streams without catastrophically
forgetting previous knowledge. Prevalent replay-based methods address this challenge by …
Online continual learning with natural distribution shifts: An empirical study with visual data
Continual learning is the problem of learning and retaining knowledge through time over
multiple tasks and environments. Research has primarily focused on the incremental …
Continual semi-supervised learning through contrastive interpolation consistency
Continual Learning (CL) investigates how to train Deep Networks on a stream of tasks
without incurring forgetting. CL settings proposed in literature assume that every incoming …
FeCAM: Exploiting the heterogeneity of class distributions in exemplar-free continual learning
Exemplar-free class-incremental learning (CIL) poses several challenges since it prohibits
the rehearsal of data from previous tasks and thus suffers from catastrophic forgetting …
The effectiveness of memory replay in large scale continual learning
We study continual learning in the large scale setting where tasks in the input sequence are
not limited to classification, and the outputs can be of high dimension. Among multiple state …