A comprehensive survey of continual learning: theory, method and application
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …
A comprehensive survey of forgetting in deep learning beyond continual learning
Forgetting refers to the loss or deterioration of previously acquired information or knowledge.
While the existing surveys on forgetting have primarily focused on continual learning …
An empirical study of catastrophic forgetting in large language models during continual fine-tuning
Catastrophic forgetting (CF) is a phenomenon that occurs in machine learning when a
model forgets previously learned information while acquiring new knowledge. As large …
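The forgetting this abstract describes is typically quantified as the drop in performance on earlier tasks after training on later ones. A minimal sketch of that standard metric follows; the accuracy matrix is illustrative, not results from the paper:

```python
# Minimal sketch: quantifying catastrophic forgetting as the drop in
# accuracy on earlier tasks after sequential fine-tuning.
# acc[i][j] = accuracy on task j after finishing training on task i.
# All numbers below are illustrative, not results from the paper.

acc = [
    [0.92, 0.10, 0.08],  # after task 0
    [0.71, 0.90, 0.12],  # after task 1: task 0 dropped by 0.21
    [0.55, 0.68, 0.88],  # after task 2
]

def forgetting(acc_matrix):
    """Average over old tasks of (best accuracy ever seen - final accuracy)."""
    n = len(acc_matrix)
    drops = []
    for task in range(n - 1):  # the last task cannot have been forgotten yet
        best = max(acc_matrix[step][task] for step in range(n))
        drops.append(best - acc_matrix[-1][task])
    return sum(drops) / len(drops)

print(f"mean forgetting: {forgetting(acc):.3f}")  # 0.295
```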
Mips-fusion: Multi-implicit-submaps for scalable and robust online neural rgb-d reconstruction
We introduce MIPS-Fusion, a robust and scalable online RGB-D reconstruction method
based on a novel neural implicit representation: the multi-implicit-submap. Different from existing …
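The core idea the snippet gestures at is covering a scene with several small neural implicit maps instead of one global network. A toy sketch of routing a query point to its submap follows; the axis-aligned grid layout and random tiny MLPs are assumptions for illustration, not MIPS-Fusion's actual design:

```python
import numpy as np

# Toy sketch: a scene covered by a grid of "submaps", each owning a tiny
# implicit network that maps a 3D point to a signed distance value.
# The grid layout and random MLPs are illustrative assumptions only.

rng = np.random.default_rng(0)
SUBMAP_SIZE = 2.0  # side length of each cubic submap, in meters

class TinyImplicitNet:
    """Stand-in for a small per-submap MLP (weights are random here)."""
    def __init__(self):
        self.w1 = rng.normal(size=(3, 16))
        self.w2 = rng.normal(size=(16, 1))

    def sdf(self, p_local):
        h = np.tanh(p_local @ self.w1)
        return (h @ self.w2).item()

submaps = {}  # lazily allocated: grid cell index -> its implicit net

def query(p):
    """Route a world-space point to the submap that contains it."""
    cell = tuple(np.floor(np.asarray(p) / SUBMAP_SIZE).astype(int))
    if cell not in submaps:                 # allocate a new submap on first
        submaps[cell] = TinyImplicitNet()   # visit, keeping the map scalable
    p_local = np.asarray(p) - np.array(cell) * SUBMAP_SIZE  # local coords
    return submaps[cell].sdf(p_local)

print(query([0.5, 0.3, 1.9]), len(submaps))  # one submap allocated so far
```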
Continual Pre-Training of Large Language Models: How to (re) warm your model?
Large language models (LLMs) are routinely pre-trained on billions of tokens, only to restart
the process over again once new data becomes available. A much cheaper and more …
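The "(re)warming" in the title refers to the learning-rate schedule when pre-training resumes: re-warm the LR from near zero and decay again, rather than continuing at the old run's small final LR. A generic linear-warmup plus cosine-decay sketch, with illustrative values not taken from the paper:

```python
import math

# Generic linear-warmup + cosine-decay schedule. When pre-training is
# continued on a new corpus, restarting this schedule at step 0 re-warms
# the LR to peak_lr. All hyperparameter values are illustrative.

def warmup_cosine(step, warmup_steps=1000, total_steps=10000,
                  peak_lr=3e-4, min_lr=3e-5):
    if step < warmup_steps:                  # linear (re-)warming phase
        return peak_lr * step / warmup_steps
    t = (step - warmup_steps) / (total_steps - warmup_steps)
    return min_lr + 0.5 * (peak_lr - min_lr) * (1 + math.cos(math.pi * t))

# Continued pre-training: reset step to 0 on the new data, so the LR
# climbs back to peak_lr instead of staying at the old run's min_lr.
for step in (0, 500, 1000, 5000, 10000):
    print(step, f"{warmup_cosine(step):.2e}")
```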
A comprehensive empirical evaluation on online continual learning
Online continual learning aims to get closer to a live learning experience by learning directly
on a stream of data with temporally shifting distribution and by storing a minimum amount of …
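Storing a minimum amount of past data is typically handled with a small replay buffer, and reservoir sampling is the standard way to keep a uniform sample of an unbounded stream in bounded memory. A minimal sketch, with the buffer size and fake stream as illustrative choices:

```python
import random

# Minimal reservoir-sampling replay buffer: keeps a bounded, uniform
# sample of an unbounded data stream, a common building block in
# online continual learning. Capacity and data are illustrative.

class ReservoirBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []
        self.seen = 0  # how many stream items have been observed

    def add(self, item):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(item)
        else:
            # replace a random slot with probability capacity / seen,
            # which keeps every stream item equally likely to be stored
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = item

    def sample(self, k):
        return random.sample(self.data, min(k, len(self.data)))

buf = ReservoirBuffer(capacity=100)
for x in range(10_000):      # stand-in for a non-stationary data stream
    buf.add(x)
print(len(buf.data), buf.sample(5))
```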
Continual pre-training mitigates forgetting in language and vision
Pre-trained models are commonly used in continual learning to initialize the model before
training on the stream of non-stationary data. However, pre-training is rarely applied during …
First session adaptation: A strong replay-free baseline for class-incremental learning
A Panos, Y Kobe, DO Reino… - Proceedings of the …, 2023 - openaccess.thecvf.com
In Class-Incremental Learning (CIL), an image classification system is exposed to
new classes in each learning session and must be updated incrementally. Methods …
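The replay-free recipe this abstract points at is to adapt the feature extractor on the first session only, then add later classes by storing one feature mean per class and classifying by nearest mean. A sketch of that second stage, where a frozen random projection stands in for the adapted backbone (all shapes and data are illustrative):

```python
import numpy as np

# Sketch of a replay-free class-incremental classifier: a frozen feature
# extractor (adapted on the first session, random here for illustration)
# plus a nearest-class-mean classifier grown one class at a time.

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 32))          # frozen "backbone" after session 1

def features(x):
    return np.maximum(x @ W, 0.0)      # illustrative frozen extractor

class NearestClassMean:
    def __init__(self):
        self.means = {}                # class id -> mean feature vector

    def add_class(self, label, xs):    # one update per new class, no replay
        self.means[label] = features(xs).mean(axis=0)

    def predict(self, x):
        f = features(x)
        return min(self.means, key=lambda c: np.linalg.norm(f - self.means[c]))

cil = NearestClassMean()
for label in (0, 1, 2):                            # one new class per session
    xs = rng.normal(loc=label, size=(20, 64))      # fake class data
    cil.add_class(label, xs)
print(cil.predict(rng.normal(loc=2, size=64)))     # likely prints 2
```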
Prototype-sample relation distillation: towards replay-free continual learning
In continual learning (CL), balancing effective adaptation while combating catastrophic
forgetting is a central challenge. Many of the recent best-performing methods utilize various …
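One way to read the title's "prototype-sample relation" is as the distribution of similarities between a sample and the class prototypes, kept stable as the model is updated. A minimal sketch of distilling that relation from an old encoder to a new one; the feature dimension, temperature, and data are assumptions, not the paper's exact formulation:

```python
import numpy as np

# Sketch of relation distillation: penalize the new encoder when a
# sample's softmax similarity distribution over class prototypes drifts
# from the old encoder's. Dimensions and temperature are illustrative.

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def relation(feat, prototypes, tau=0.1):
    """Softmax over cosine similarities from one sample to all prototypes."""
    f = feat / np.linalg.norm(feat)
    P = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    return softmax(P @ f / tau)

def relation_distill_loss(feat_old, feat_new, prototypes):
    """Cross-entropy of the new relation against the old one (frozen target)."""
    p_old = relation(feat_old, prototypes)   # target from the previous model
    p_new = relation(feat_new, prototypes)   # prediction from current model
    return -np.sum(p_old * np.log(p_new + 1e-12))

rng = np.random.default_rng(0)
protos = rng.normal(size=(5, 32))            # five class prototypes
f_old = rng.normal(size=32)                  # same input, old vs. new encoder
f_new = f_old + 0.05 * rng.normal(size=32)   # slightly drifted features
print(f"{relation_distill_loss(f_old, f_new, protos):.4f}")
```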
Grow and merge: A unified framework for continuous categories discovery
X Zhang, J Jiang, Y Feng, ZF Wu… - Advances in …, 2022 - proceedings.neurips.cc
Although a number of studies are devoted to novel category discovery, most of them assume
a static setting where both labeled and unlabeled data are given at once for finding new …