Memory-efficient semi-supervised continual learning: The world is its own replay buffer

J Smith, J Balloch, YC Hsu, Z Kira - 2021 International Joint …, 2021 - ieeexplore.ieee.org
Rehearsal is a critical component for class-incremental continual learning, yet it requires a
substantial memory budget. Our work investigates whether we can significantly reduce this …

CUCL: Codebook for Unsupervised Continual Learning

C Cheng, J Song, X Zhu, J Zhu, L Gao… - Proceedings of the 31st …, 2023 - dl.acm.org
The focus of this study is on Unsupervised Continual Learning (UCL), as it presents an
alternative to Supervised Continual Learning, which requires high-quality manually labeled data …

Representational continuity for unsupervised continual learning

D Madaan, J Yoon, Y Li, Y Liu, SJ Hwang - arXiv preprint arXiv …, 2021 - arxiv.org
Continual learning (CL) aims to learn a sequence of tasks without forgetting the previously
acquired knowledge. However, recent CL advances are restricted to supervised continual …

ORDisCo: Effective and efficient usage of incremental unlabeled data for semi-supervised continual learning

L Wang, K Yang, C Li, L Hong… - Proceedings of the …, 2021 - openaccess.thecvf.com
Continual learning usually assumes the incoming data are fully labeled, which may not hold
in real applications. In this work, we consider semi-supervised continual learning …

On tiny episodic memories in continual learning

A Chaudhry, M Rohrbach, M Elhoseiny… - arXiv preprint arXiv …, 2019 - arxiv.org
In continual learning (CL), an agent learns from a stream of tasks leveraging prior
experience to transfer knowledge to future tasks. It is an ideal framework to decrease the …
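
Several of the works listed here rehearse a small episodic memory alongside the incoming stream. The following is a minimal sketch of that idea only, not code from any of the papers above: it assumes a PyTorch classifier and a reservoir-sampled buffer, and every name and size (ReservoirBuffer, train_step, BUFFER_SIZE, REPLAY_BATCH) is an illustrative assumption.

# Minimal experience-replay sketch with a tiny episodic memory (illustrative only).
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

BUFFER_SIZE = 200     # "tiny" memory: a few hundred stored examples
REPLAY_BATCH = 32     # how many stored examples to replay per step

class ReservoirBuffer:
    """Keeps a uniform sample of the stream via reservoir sampling."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.examples: list[tuple[torch.Tensor, int]] = []
        self.n_seen = 0

    def add(self, x: torch.Tensor, y: int) -> None:
        self.n_seen += 1
        if len(self.examples) < self.capacity:
            self.examples.append((x.clone(), y))
        else:
            j = random.randrange(self.n_seen)
            if j < self.capacity:
                self.examples[j] = (x.clone(), y)

    def sample(self, batch_size: int):
        batch = random.sample(self.examples, min(batch_size, len(self.examples)))
        xs = torch.stack([x for x, _ in batch])
        ys = torch.tensor([y for _, y in batch])
        return xs, ys

def train_step(model, opt, buffer, x, y):
    """One step of experience replay: loss on the current batch plus a replayed batch."""
    opt.zero_grad()
    loss = F.cross_entropy(model(x), y)
    if buffer.examples:                      # rehearse examples from earlier tasks
        rx, ry = buffer.sample(REPLAY_BATCH)
        loss = loss + F.cross_entropy(model(rx), ry)
    loss.backward()
    opt.step()
    for xi, yi in zip(x, y):                 # update the memory after the step
        buffer.add(xi, int(yi))
    return float(loss)

if __name__ == "__main__":
    model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    buffer = ReservoirBuffer(BUFFER_SIZE)
    # Fake two-"task" stream of random data, just to show the call pattern.
    for task in range(2):
        x = torch.randn(32, 1, 28, 28)
        y = torch.randint(0, 10, (32,))
        print(f"task {task} loss:", train_step(model, opt, buffer, x, y))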

Regularizing second-order influences for continual learning

Z Sun, Y Mu, G Hua - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Continual learning aims to learn on non-stationary data streams without catastrophically
forgetting previous knowledge. Prevalent replay-based methods address this challenge by …

Online continual learning with natural distribution shifts: An empirical study with visual data

Z Cai, O Sener, V Koltun - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
Continual learning is the problem of learning and retaining knowledge through time over
multiple tasks and environments. Research has primarily focused on the incremental …

Continual semi-supervised learning through contrastive interpolation consistency

M Boschini, P Buzzega, L Bonicelli, A Porrello… - Pattern Recognition …, 2022 - Elsevier
Continual Learning (CL) investigates how to train Deep Networks on a stream of tasks
without incurring forgetting. CL settings proposed in the literature assume that every incoming …

FeCAM: Exploiting the heterogeneity of class distributions in exemplar-free continual learning

D Goswami, Y Liu, B Twardowski… - Advances in Neural …, 2024 - proceedings.neurips.cc
Exemplar-free class-incremental learning (CIL) poses several challenges since it prohibits
the rehearsal of data from previous tasks and thus suffers from catastrophic forgetting …

The effectiveness of memory replay in large scale continual learning

Y Balaji, M Farajtabar, D Yin, A Mott, A Li - arXiv preprint arXiv:2010.02418, 2020 - arxiv.org
We study continual learning in the large-scale setting, where tasks in the input sequence are
not limited to classification and the outputs can be of high dimension. Among multiple state …