BNS: Building network structures dynamically for continual learning

Q Qin, W Hu, H Peng, D Zhao… - Advances in Neural …, 2021 - proceedings.neurips.cc
Continual learning (CL) of a sequence of tasks is often accompanied by the catastrophic
forgetting (CF) problem. Existing research has achieved remarkable results in overcoming …

Continual learning by using information of each class holistically

W Hu, Q Qin, M Wang, J Ma, B Liu - … of the AAAI Conference on Artificial …, 2021 - ojs.aaai.org
Continual learning (CL) incrementally learns a sequence of tasks while solving the
catastrophic forgetting (CF) problem. Existing methods mainly try to deal with CF directly. In …

Achieving forgetting prevention and knowledge transfer in continual learning

Z Ke, B Liu, N Ma, H Xu, L Shu - Advances in Neural …, 2021 - proceedings.neurips.cc
Continual learning (CL) learns a sequence of tasks incrementally with the goal of achieving
two main objectives: overcoming catastrophic forgetting (CF) and encouraging knowledge …

Beyond not-forgetting: Continual learning with backward knowledge transfer

S Lin, L Yang, D Fan, J Zhang - Advances in Neural …, 2022 - proceedings.neurips.cc
By learning a sequence of tasks continually, an agent in continual learning (CL) can improve
the learning performance of both a new task and 'old' tasks by leveraging the forward …

Efficient continual learning with modular networks and task-driven priors

T Veniat, L Denoyer, MA Ranzato - arXiv preprint arXiv:2012.12631, 2020 - arxiv.org
Existing literature in Continual Learning (CL) has focused on overcoming catastrophic
forgetting, the inability of the learner to recall how to perform tasks observed in the past …

Continual learning of a mixed sequence of similar and dissimilar tasks

Z Ke, B Liu, X Huang - Advances in neural information …, 2020 - proceedings.neurips.cc
Existing research on continual learning of a sequence of tasks has focused on dealing with
catastrophic forgetting, where the tasks are assumed to be dissimilar and have little shared …

Parameter-level soft-masking for continual learning

T Konishi, M Kurokawa, C Ono, Z Ke… - International …, 2023 - proceedings.mlr.press
Existing research on task incremental learning in continual learning has primarily focused
on preventing catastrophic forgetting (CF). Although several techniques have achieved …

A multi-head model for continual learning via out-of-distribution replay

G Kim, B Liu, Z Ke - Conference on Lifelong Learning …, 2022 - proceedings.mlr.press
This paper studies class incremental learning (CIL) of continual learning (CL). Many
approaches have been proposed to deal with catastrophic forgetting (CF) in CIL. Most …

Does continual learning equally forget all parameters?

H Zhao, T Zhou, G Long, J Jiang… - … on Machine Learning, 2023 - proceedings.mlr.press
Distribution shift (e.g., task or domain shift) in continual learning (CL) usually results in
catastrophic forgetting of previously learned knowledge. Although it can be alleviated by …

Helpful or harmful: Inter-task association in continual learning

H Jin, E Kim - European Conference on Computer Vision, 2022 - Springer
When optimizing sequentially incoming tasks, deep neural networks generally suffer from
catastrophic forgetting due to their lack of ability to maintain knowledge from old tasks. This …