Towards continual reinforcement learning: A review and perspectives
In this article, we aim to provide a literature review of different formulations and approaches
to continual reinforcement learning (RL), also known as lifelong or non-stationary RL. We …
Continual learning for recurrent neural networks: an empirical evaluation
Learning continuously throughout a model's lifetime is fundamental to deploying machine learning
solutions robust to drifts in the data distribution. Advances in Continual Learning (CL) with …
Class-incremental continual learning into the extended der-verse
The staple of human intelligence is the capability of acquiring knowledge in a continuous
fashion. In stark contrast, Deep Networks forget catastrophically and, for this reason, the sub …
Overcoming catastrophic forgetting with hard attention to the task
Catastrophic forgetting occurs when a neural network loses the information learned in a
previous task after training on subsequent tasks. This problem remains a hurdle for artificial …
Meta-learning representations for continual learning
The reviews had two major concerns: lack of a benchmarking on a complex dataset, and
unclear writing. To address these two major issues we: 1-Rewrote experiments section with …
The building blocks of a brain-inspired computer
JD Kendall, S Kumar - Applied Physics Reviews, 2020 - pubs.aip.org
Computers have undergone tremendous improvements in performance over the last 60
years, but those improvements have significantly slowed down over the last decade, owing …
Learning to continually learn
Continual lifelong learning requires an agent or model to learn many sequentially ordered
tasks, building on previous knowledge without catastrophically forgetting it. Much work has …
Why there are complementary learning systems in the hippocampus and neocortex: insights from the successes and failures of connectionist models of learning and …
JL McClelland, BL McNaughton… - Psychological review, 1995 - psycnet.apa.org
Damage to the hippocampal system disrupts recent memory but leaves remote memory
intact. The account presented here suggests that memories are first stored via synaptic …
Catastrophic forgetting in connectionist networks
RM French - Trends in cognitive sciences, 1999 - cell.com
All natural cognitive systems, and, in particular, our own, gradually forget previously learned
information. Plausible models of human cognition should therefore exhibit similar patterns of …
Generalisation of regular and irregular morphological patterns
Both regular inflectional patterns (walk-walked) and irregular ones (swing-swung) can be
applied productively to novel words (e.g. wug-wugged; spling-splung). Theories of generative …