Towards continual reinforcement learning: A review and perspectives

K Khetarpal, M Riemer, I Rish, D Precup - Journal of Artificial Intelligence …, 2022 - jair.org
In this article, we aim to provide a literature review of different formulations and approaches
to continual reinforcement learning (RL), also known as lifelong or non-stationary RL. We …

Continual learning for recurrent neural networks: an empirical evaluation

A Cossu, A Carta, V Lomonaco, D Bacciu - Neural Networks, 2021 - Elsevier
Learning continuously throughout a model's lifetime is fundamental to deploying machine learning
solutions that are robust to drifts in the data distribution. Advances in Continual Learning (CL) with …

Class-incremental continual learning into the extended der-verse

M Boschini, L Bonicelli, P Buzzega… - IEEE transactions on …, 2022 - ieeexplore.ieee.org
The staple of human intelligence is the capability of acquiring knowledge in a continuous
fashion. In stark contrast, Deep Networks forget catastrophically and, for this reason, the sub …

Overcoming catastrophic forgetting with hard attention to the task

J Serra, D Suris, M Miron… - … conference on machine …, 2018 - proceedings.mlr.press
Catastrophic forgetting occurs when a neural network loses the information learned in a
previous task after training on subsequent tasks. This problem remains a hurdle for artificial …

Meta-learning representations for continual learning

K Javed, M White - Advances in neural information …, 2019 - proceedings.neurips.cc
The reviews had two major concerns: lack of a benchmarking on a complex dataset, and
unclear writing. To address these two major issues we: 1-Rewrote experiments section with …

The building blocks of a brain-inspired computer

JD Kendall, S Kumar - Applied Physics Reviews, 2020 - pubs.aip.org
Computers have undergone tremendous improvements in performance over the last 60
years, but those improvements have significantly slowed down over the last decade, owing …

Learning to continually learn

S Beaulieu, L Frati, T Miconi, J Lehman, KO Stanley… - ECAI 2020, 2020 - ebooks.iospress.nl
Continual lifelong learning requires an agent or model to learn many sequentially ordered
tasks, building on previous knowledge without catastrophically forgetting it. Much work has …

Why there are complementary learning systems in the hippocampus and neocortex: insights from the successes and failures of connectionist models of learning and …

JL McClelland, BL McNaughton… - Psychological review, 1995 - psycnet.apa.org
Damage to the hippocampal system disrupts recent memory but leaves remote memory
intact. The account presented here suggests that memories are first stored via synaptic …

Catastrophic forgetting in connectionist networks

RM French - Trends in cognitive sciences, 1999 - cell.com
All natural cognitive systems, and in particular our own, gradually forget previously learned
information. Plausible models of human cognition should therefore exhibit similar patterns of …

Generalisation of regular and irregular morphological patterns

S Prasada, S Pinker - Language and cognitive processes, 1993 - Taylor & Francis
Both regular inflectional patterns (walk-walked) and irregular ones (swing-swung) can be
applied productively to novel words (e.g., wug-wugged; spling-splung). Theories of generative …