Beyond not-forgetting: Continual learning with backward knowledge transfer
By learning a sequence of tasks continually, an agent in continual learning (CL) can improve
the learning performance of both a new task and old tasks by leveraging the forward …
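Backward transfer of this kind is commonly quantified with the BWT metric of Lopez-Paz & Ranzato (2017), not necessarily the measure used in this paper; a minimal sketch with a hypothetical accuracy matrix:

```python
import numpy as np

# R[t, i] = test accuracy on task i after training through task t.
# Hypothetical numbers for a 3-task stream.
R = np.array([
    [0.90, 0.00, 0.00],
    [0.88, 0.85, 0.00],
    [0.91, 0.86, 0.87],
])

T = R.shape[0]
# BWT = mean over old tasks i of (final accuracy on i) - (accuracy on i
# right after learning i); positive values indicate backward transfer.
bwt = np.mean([R[T - 1, i] - R[i, i] for i in range(T - 1)])
print(f"BWT = {bwt:+.3f}")  # here +0.010: old tasks improved slightly
```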
Bns: Building network structures dynamically for continual learning
Continual learning (CL) of a sequence of tasks is often accompanied by the catastrophic
forgetting (CF) problem. Existing research has achieved remarkable results in overcoming …
Sub-network Discovery and Soft-masking for Continual Learning of Mixed Tasks
Continual learning (CL) has two main objectives: preventing catastrophic forgetting (CF) and
encouraging knowledge transfer (KT). The existing literature mainly focused on overcoming …
Learning invariant representation for continual learning
Continual learning aims to provide intelligent agents that are capable of continually learning
a sequence of tasks, building on previously learned knowledge. A key challenge in this …
Parameter-level soft-masking for continual learning
Existing research on task incremental learning in continual learning has primarily focused
on preventing catastrophic forgetting (CF). Although several techniques have achieved …
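The paper's specific algorithm is not reproduced here; a generic sketch of parameter-level soft-masking in PyTorch, where the importance store and its normalization are assumptions:

```python
import torch

def soft_mask_gradients(model, importance, eps=1e-8):
    """Scale each parameter's gradient by (1 - normalized importance).

    `importance` maps parameter name -> tensor of accumulated importance
    (e.g., a running average of gradient magnitudes from earlier tasks).
    Important parameters get small updates rather than a hard freeze,
    which preserves old knowledge while leaving room for transfer.
    """
    for name, p in model.named_parameters():
        if p.grad is None or name not in importance:
            continue
        imp = importance[name] / (importance[name].max() + eps)  # to [0, 1]
        p.grad.mul_(1.0 - imp)  # soft mask applied in place

# In a training step (sketch):
# loss.backward()
# soft_mask_gradients(model, importance)
# optimizer.step()
```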
On tiny episodic memories in continual learning
In continual learning (CL), an agent learns from a stream of tasks leveraging prior
experience to transfer knowledge to future tasks. It is an ideal framework to decrease the …
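The tiny-memory setting is typically implemented as experience replay over a fixed-size buffer filled by reservoir sampling; a minimal self-contained sketch (class and method names are illustrative):

```python
import random

class TinyEpisodicMemory:
    """Fixed-capacity memory filled by reservoir sampling, so every
    example seen in the stream has equal probability of being kept."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0

    def add(self, example):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            j = random.randrange(self.seen)  # uniform over all seen items
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        return random.sample(self.buffer, min(k, len(self.buffer)))

# Usage (sketch): for each new batch, also replay a small batch from
# memory, then add the new examples: memory.add((x, y)).
```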
Does continual learning equally forget all parameters?
Distribution shift (e.g., task or domain shift) in continual learning (CL) usually results in
catastrophic forgetting of previously learned knowledge. Although it can be alleviated by …
Self-attention meta-learner for continual learning
Continual learning aims to provide intelligent agents capable of learning multiple tasks
sequentially with neural networks. One of its main challenges, catastrophic forgetting, is …
Achieving a better stability-plasticity trade-off via auxiliary networks in continual learning
In contrast to the natural capabilities of humans to learn new tasks in a sequential fashion,
neural networks are known to suffer from catastrophic forgetting, where the model's …
Generative feature replay with orthogonal weight modification for continual learning
The ability of intelligent agents to learn and remember multiple tasks sequentially is crucial
to achieving artificial general intelligence. Many continual learning (CL) methods have been …
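Orthogonal weight modification (OWM, Zeng et al., 2019) has a compact recursive form; a minimal numpy sketch of the projector, with the regularizer alpha and tensor shapes simplified as assumptions:

```python
import numpy as np

class OWMProjector:
    """Project weight updates onto the subspace orthogonal to inputs of
    previous tasks, so new learning leaves old mappings undisturbed."""

    def __init__(self, dim, alpha=1.0):
        # Recursive least-squares style projector, initialized to I/alpha.
        self.P = np.eye(dim) / alpha

    def update(self, x):
        # x: (dim, 1) input column observed while training an old task.
        # Sherman-Morrison update keeps P = (alpha*I + sum x x^T)^{-1}.
        Px = self.P @ x
        self.P -= (Px @ Px.T) / (1.0 + x.T @ Px)

    def project(self, grad):
        # grad: (out_dim, dim) gradient of a weight matrix; P is symmetric,
        # so right-multiplying projects updates away from old input space.
        return grad @ self.P
```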
Related searches
- continual learning knowledge transfer
- continual learning feature replay
- continual learning mixed tasks
- continual learning weight modification
- continual learning network structures
- continual learning parameter level
- continual learning stability plasticity
- continual learning self attention
- continual learning meta learner