A comprehensive survey of continual learning: theory, method and application

L Wang, X Zhang, H Su, J Zhu - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …

Embracing change: Continual learning in deep neural networks

R Hadsell, D Rao, AA Rusu, R Pascanu - Trends in cognitive sciences, 2020 - cell.com
Artificial intelligence research has seen enormous progress over the past few decades, but it
predominantly relies on fixed datasets and stationary environments. Continual learning is an …

Class-incremental learning: survey and performance evaluation on image classification

M Masana, X Liu, B Twardowski… - … on Pattern Analysis …, 2022 - ieeexplore.ieee.org
For future learning systems, incremental learning is desirable because it allows for: efficient
resource usage by eliminating the need to retrain from scratch at the arrival of new data; …

Online continual learning in image classification: An empirical survey

Z Mai, R Li, J Jeong, D Quispe, H Kim, S Sanner - Neurocomputing, 2022 - Elsevier
Online continual learning for image classification studies the problem of learning to classify
images from an online stream of data and tasks, where tasks may include new classes …
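To make the online-stream setting concrete, here is a minimal sketch of a replay-based online learner; the reservoir buffer, batch sizes, and update rule are illustrative assumptions, not the specific protocols benchmarked in the survey.

```python
import random
import torch
import torch.nn.functional as F

class ReservoirBuffer:
    """Reservoir sampling: every stream example is retained with equal probability."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []      # list of (x, y) pairs
        self.seen = 0

    def add(self, x, y):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = (x, y)

    def sample(self, k):
        batch = random.sample(self.data, min(k, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)

def online_step(model, opt, buffer, x, y, replay_k=10):
    """One update per incoming mini-batch: current samples plus a replayed batch."""
    model.train()
    loss = F.cross_entropy(model(x), y)
    if buffer.data:
        rx, ry = buffer.sample(replay_k)
        loss = loss + F.cross_entropy(model(rx), ry)
    opt.zero_grad()
    loss.backward()
    opt.step()
    for xi, yi in zip(x, y):
        buffer.add(xi, yi)
```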

Gdumb: A simple approach that questions our progress in continual learning

A Prabhu, PHS Torr, PK Dokania - … , Glasgow, UK, August 23–28, 2020 …, 2020 - Springer
We discuss a general formulation for the Continual Learning (CL) problem for classification—
a learning task where a stream provides samples to a learner and the goal of the learner …
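GDumb's recipe, as I understand it, is deliberately simple: greedily keep a class-balanced memory from the stream, then train a model from scratch on that memory alone. A minimal sketch of such a balanced memory follows (capacity handling is simplified and labels are assumed to be integer class ids):

```python
from collections import defaultdict

class GreedyBalancedMemory:
    """Greedy class-balanced memory in the spirit of GDumb (simplified sketch)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.per_class = defaultdict(list)   # class id -> stored samples

    def __len__(self):
        return sum(len(v) for v in self.per_class.values())

    def add(self, x, y):
        y = int(y)
        if len(self) < self.capacity:
            self.per_class[y].append(x)
            return
        # Memory full: accept the sample only if its class is under-represented,
        # evicting one sample from the currently largest class.
        largest = max(self.per_class, key=lambda c: len(self.per_class[c]))
        if len(self.per_class[y]) < len(self.per_class[largest]):
            self.per_class[largest].pop()
            self.per_class[y].append(x)
```

At evaluation time the learner is then trained only on the stored samples, which is what makes the method a useful sanity check against more elaborate continual learning pipelines.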

A continual learning survey: Defying forgetting in classification tasks

M De Lange, R Aljundi, M Masana… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
Artificial neural networks thrive in solving the classification problem for a particular rigid task,
acquiring knowledge through generalized learning behaviour from a distinct training phase …

Large scale incremental learning

Y Wu, Y Chen, L Wang, Y Ye, Z Liu… - Proceedings of the …, 2019 - openaccess.thecvf.com
Modern machine learning suffers from catastrophic forgetting when learning new classes
incrementally. The performance dramatically degrades due to the missing data of old …
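This paper is commonly associated with a two-parameter bias correction: because training data for old classes is missing, logits of new classes tend to dominate, so a small linear correction is fitted on a held-out class-balanced set while the backbone stays frozen. A minimal sketch of that idea (module and parameter names are assumptions):

```python
import torch
import torch.nn as nn

class BiasCorrection(nn.Module):
    """Rescale only the logits of newly added classes: alpha * o + beta."""
    def __init__(self, num_old):
        super().__init__()
        self.num_old = num_old
        self.alpha = nn.Parameter(torch.ones(1))
        self.beta = nn.Parameter(torch.zeros(1))

    def forward(self, logits):
        old = logits[:, :self.num_old]
        new = self.alpha * logits[:, self.num_old:] + self.beta
        return torch.cat([old, new], dim=1)
```

Only `alpha` and `beta` are trained in this correction stage; all other parameters remain fixed.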

Dreaming to distill: Data-free knowledge transfer via deepinversion

H Yin, P Molchanov, JM Alvarez, Z Li… - Proceedings of the …, 2020 - openaccess.thecvf.com
We introduce DeepInversion, a new method for synthesizing images from the image
distribution used to train a deep neural network. We "invert" a trained network (teacher) to …
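A stripped-down sketch of the inversion step: synthetic inputs are optimized so that a frozen, pretrained classifier assigns them chosen labels. The full method additionally regularizes with image priors and feature statistics; the resolution, step count, and learning rate below are assumptions.

```python
import torch
import torch.nn.functional as F

def invert(teacher, targets, steps=2000, lr=0.05, shape=(3, 224, 224)):
    """Optimize random noise images until the frozen teacher labels them as `targets`."""
    teacher.eval()
    for p in teacher.parameters():
        p.requires_grad_(False)
    x = torch.randn(len(targets), *shape, requires_grad=True)
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(teacher(x), targets)
        loss = loss + 1e-4 * x.pow(2).mean()   # weak L2 image prior
        loss.backward()
        opt.step()
    return x.detach()
```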

Learning a unified classifier incrementally via rebalancing

S Hou, X Pan, CC Loy, Z Wang… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com
Conventionally, deep neural networks are trained offline, relying on a large dataset
prepared in advance. This paradigm is often challenged in real-world applications, e.g., online …
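The "rebalancing" in the title is usually realized with a cosine-normalized classifier, so that old- and new-class weights produce logits on the same scale. A minimal sketch under that assumption (the paper's additional less-forget and margin terms are omitted):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CosineClassifier(nn.Module):
    """Logits are scaled cosine similarities between normalized features and class weights."""
    def __init__(self, in_dim, num_classes, scale=16.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, in_dim))
        self.scale = nn.Parameter(torch.tensor(scale))

    def forward(self, features):
        f = F.normalize(features, dim=1)
        w = F.normalize(self.weight, dim=1)
        return self.scale * f @ w.t()
```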

End-to-end incremental learning

FM Castro, MJ Marín-Jiménez, N Guil… - Proceedings of the …, 2018 - openaccess.thecvf.com
Although deep learning approaches have stood out in recent years due to their state-of-the-
art results, they continue to suffer from catastrophic forgetting, a dramatic decrease in overall …
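A standard ingredient in this line of work is a distillation term that keeps the updated model's outputs on old classes close to the previous model's, alongside the usual cross-entropy on all classes. A minimal sketch (temperature and weighting are assumptions):

```python
import torch.nn.functional as F

def incremental_loss(new_logits, old_model_logits, targets, num_old, T=2.0, lam=1.0):
    """Cross-entropy on all classes plus distillation on the old-class logits."""
    ce = F.cross_entropy(new_logits, targets)
    log_p = F.log_softmax(new_logits[:, :num_old] / T, dim=1)
    q = F.softmax(old_model_logits[:, :num_old] / T, dim=1)
    kd = F.kl_div(log_p, q, reduction="batchmean") * (T * T)
    return ce + lam * kd
```

Here `old_model_logits` are produced by a frozen copy of the model from before the new classes were added.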