Continual learning with tiny episodic memories
In continual learning (CL), an agent learns from a stream of tasks leveraging prior
experience to transfer knowledge to future tasks. It is an ideal framework to decrease the …
On tiny episodic memories in continual learning
In continual learning (CL), an agent learns from a stream of tasks leveraging prior
experience to transfer knowledge to future tasks. It is an ideal framework to decrease the …
Efficient continual learning with modular networks and task-driven priors
Existing literature in Continual Learning (CL) has focused on overcoming catastrophic
forgetting, the inability of the learner to recall how to perform tasks observed in the past …
Beyond not-forgetting: Continual learning with backward knowledge transfer
By learning a sequence of tasks continually, an agent in continual learning (CL) can improve
the learning performance of both a new task and 'old' tasks by leveraging the forward …
Learning to learn without forgetting by maximizing transfer and minimizing interference
Lack of performance when it comes to continual learning over non-stationary distributions of
data remains a major challenge in scaling neural network learning to more human realistic …
Online continual learning under extreme memory constraints
Continual Learning (CL) aims to develop agents emulating the human ability to sequentially
learn new tasks while being able to retain knowledge obtained from past experiences. In this …
DualPrompt: Complementary prompting for rehearsal-free continual learning
Continual learning aims to enable a single model to learn a sequence of tasks without
catastrophic forgetting. Top-performing methods usually require a rehearsal buffer to store …
Memory-efficient semi-supervised continual learning: The world is its own replay buffer
Rehearsal is a critical component for class-incremental continual learning, yet it requires a
substantial memory budget. Our work investigates whether we can significantly reduce this …
Online fast adaptation and knowledge accumulation: a new approach to continual learning
Continual learning studies agents that learn from streams of tasks without forgetting previous
ones while adapting to new ones. Two recent continual-learning scenarios have opened …
Gradient-based editing of memory examples for online task-free continual learning
We explore task-free continual learning (CL), in which a model is trained to avoid
catastrophic forgetting in the absence of explicit task boundaries or identities. Among many …