A comprehensive survey of continual learning: theory, method and application
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …
Biological underpinnings for lifelong learning machines
D Kudithipudi, M Aguilar-Simon, J Babb… - Nature Machine …, 2022 - nature.com
Biological organisms learn from interactions with their environment throughout their lifetime.
For artificial systems to successfully act and adapt in the real world, it is desirable to similarly …
Three types of incremental learning
Incrementally learning new information from a non-stationary stream of data, referred to as
'continual learning', is a key feature of natural intelligence, but a challenging problem for …
A generalist agent
Inspired by progress in large-scale language modeling, we apply a similar approach
towards building a single generalist agent beyond the realm of text outputs. The agent …
DualPrompt: Complementary prompting for rehearsal-free continual learning
Continual learning aims to enable a single model to learn a sequence of tasks without
catastrophic forgetting. Top-performing methods usually require a rehearsal buffer to store …
Deep class-incremental learning: A survey
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results
in many vision tasks in the closed world. However, novel classes emerge from time to time in …
Learn from others and be yourself in heterogeneous federated learning
Federated learning has emerged as an important distributed learning paradigm, which
normally involves collaborative updating with others and local updating on private data …
Learning to prompt for continual learning
The mainstream paradigm behind continual learning has been to adapt the model
parameters to non-stationary data distributions, where catastrophic forgetting is the central …
LST: Ladder side-tuning for parameter and memory efficient transfer learning
Fine-tuning large pre-trained models on downstream tasks has been adopted in a variety of
domains recently. However, it is costly to update the entire parameter set of large pre-trained …
FOSTER: Feature boosting and compression for class-incremental learning
The ability to learn new concepts continually is necessary in this ever-changing world.
However, deep neural networks suffer from catastrophic forgetting when learning new …