MagMax: Leveraging model merging for seamless continual learning
This paper introduces a continual learning approach named MagMax, which utilizes model
merging to enable large pre-trained models to continuously learn from new data without …
No one left behind: Real-world federated class-incremental learning
Federated learning (FL) is a popular collaborative training framework that aggregates the model
parameters of decentralized local clients. However, most FL methods unreasonably assume …
A unified approach to domain incremental learning with memory: Theory and algorithm
Domain incremental learning aims to adapt to a sequence of domains with access
to only a small subset of data (i.e., memory) from previous domains. Various methods have …
Weighted ensemble models are strong continual learners
In this work, we study the problem of continual learning (CL) where the goal is to learn a
model on a sequence of tasks, under the assumption that the data from the previous tasks …
MoBoo: Memory-boosted vision transformer for class-incremental learning
Continual learning strives to acquire knowledge across sequential tasks without forgetting
previously assimilated knowledge. Current state-of-the-art methodologies utilize dynamic …
Less is More: Selective reduction of CT data for self-supervised pre-training of deep learning models with contrastive learning improves downstream …
D Wolf, T Payer, CS Lisson, CG Lisson, M Beer… - Computers in Biology …, 2024 - Elsevier
Background: Self-supervised pre-training of deep learning models with contrastive learning
is a widely used technique in image analysis. Current findings indicate a strong potential for …
Hyper-feature aggregation and relaxed distillation for class incremental learning
Although neural networks have been used extensively in pattern recognition scenarios, the
pre-acquisition of datasets is still challenging. In most pattern recognition areas, preparing a …
Advancing autonomy through lifelong learning: a survey of autonomous intelligent systems
D Zhu, Q Bu, Z Zhu, Y Zhang, Z Wang - Frontiers in Neurorobotics, 2024 - frontiersin.org
The combination of lifelong learning algorithms with autonomous intelligent systems (AIS) is
gaining popularity due to its ability to enhance AIS performance, but the existing summaries …
FedProK: Trustworthy Federated Class-Incremental Learning via Prototypical Feature Knowledge Transfer
Federated Class-Incremental Learning (FCIL) focuses on continually transferring
the previous knowledge to learn new classes in dynamic Federated Learning (FL). However …
Knowledge accumulation in continually learned representations and the issue of feature forgetting
T Hess, E Verwimp, GM van de Ven… - arXiv preprint arXiv …, 2023 - arxiv.org
Continual learning research has shown that neural networks suffer from catastrophic
forgetting" at the output level", but it is debated whether this is also the case at the level of …
forgetting" at the output level", but it is debated whether this is also the case at the level of …