MagMax: Leveraging model merging for seamless continual learning

D Marczak, B Twardowski, T Trzciński… - European Conference on …, 2025 - Springer
This paper introduces a continual learning approach named MagMax, which utilizes model
merging to enable large pre-trained models to continuously learn from new data without …
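The snippet names model merging as the mechanism but not the merge rule. Below is a minimal sketch of magnitude-based merging of task vectors, in the spirit the title suggests; the dict-based "models", the helper names, and all values are illustrative assumptions, not the paper's implementation:

```python
# Generic sketch of magnitude-based model merging (illustrative only,
# not MagMax's exact algorithm). Parameters are plain name -> value dicts.

def task_vector(pretrained, finetuned):
    """Difference between a fine-tuned checkpoint and its pre-trained base."""
    return {k: finetuned[k] - pretrained[k] for k in pretrained}

def merge_max_magnitude(pretrained, task_vectors):
    """Per parameter, keep the task-vector entry with the largest absolute
    value, then add it back onto the pre-trained weights."""
    merged = {}
    for k in pretrained:
        dominant = max((tv[k] for tv in task_vectors), key=abs)
        merged[k] = pretrained[k] + dominant
    return merged

base = {"w": 0.5, "b": 0.0}            # hypothetical checkpoints
after_task1 = {"w": 1.0, "b": 0.25}
after_task2 = {"w": 0.75, "b": -0.5}
tvs = [task_vector(base, after_task1), task_vector(base, after_task2)]
print(merge_max_magnitude(base, tvs))  # {'w': 1.0, 'b': -0.5}
```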

No one left behind: Real-world federated class-incremental learning

J Dong, H Li, Y Cong, G Sun, Y Zhang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Federated learning (FL) is a popular collaborative training framework that aggregates the model
parameters of decentralized local clients. However, most FL methods unreasonably assume …
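Since the snippet frames FL as aggregating client parameters, here is a minimal sketch of the textbook FedAvg-style weighted average (generic FL background, not this paper's contribution); the client dicts and sample counts are made up for illustration:

```python
# Minimal FedAvg-style aggregation (textbook FL, not this paper's method).
# Each client update is a name -> value dict; weights are sample counts.

def fedavg(client_params, sample_counts):
    total = sum(sample_counts)
    keys = client_params[0].keys()
    return {
        k: sum(p[k] * n for p, n in zip(client_params, sample_counts)) / total
        for k in keys
    }

clients = [{"w": 1.0}, {"w": 0.0}]            # hypothetical client updates
print(fedavg(clients, sample_counts=[3, 1]))  # {'w': 0.75}
```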

A unified approach to domain incremental learning with memory: Theory and algorithm

H Shi, H Wang - Advances in Neural Information Processing …, 2024 - proceedings.neurips.cc
Domain incremental learning aims to adapt to a sequence of domains with access
to only a small subset of data (i.e., memory) from previous domains. Various methods have …
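The setting revolves around a small memory of past-domain data. A common generic way to maintain such a memory is a fixed-size buffer filled by reservoir sampling, sketched below; this is a standard continual-learning utility, not necessarily the mechanism the paper analyzes:

```python
import random

# Fixed-size rehearsal memory via reservoir sampling: every example seen
# so far has an equal chance of being in the buffer. Generic CL utility.

class ReservoirMemory:
    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0

    def add(self, example):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

memory = ReservoirMemory(capacity=3)
for x in range(100):    # stream of examples across domains
    memory.add(x)
print(memory.buffer)    # 3 examples, sampled uniformly from the stream
```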

Weighted ensemble models are strong continual learners

IE Marouf, S Roy, E Tartaglione… - European Conference on …, 2025 - Springer
In this work, we study the problem of continual learning (CL) where the goal is to learn a
model on a sequence of tasks, under the assumption that the data from the previous tasks …
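The title points to weight-space ensembling across tasks. One simple generic form is an exponential moving average of checkpoints, sketched below; the decay value and dict-based checkpoints are illustrative assumptions, and the paper's actual weighting scheme may differ:

```python
# Exponential moving average (EMA) of weights across tasks: a simple
# weight-space ensemble (illustrative; not this paper's exact scheme).

def ema_update(avg, new, decay=0.5):
    """Blend the running average toward the newest checkpoint."""
    return {k: decay * avg[k] + (1 - decay) * new[k] for k in avg}

avg = {"w": 0.0}                       # hypothetical initial checkpoint
for ckpt in [{"w": 1.0}, {"w": 2.0}]:  # checkpoints after tasks 1 and 2
    avg = ema_update(avg, ckpt)
print(avg)  # {'w': 1.25} -- a weight-space average over the sequence
```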

Moboo: Memory-boosted vision transformer for class-incremental learning

B Ni, X Nie, C Zhang, S Xu, X Zhang… - … on Circuits and …, 2024 - ieeexplore.ieee.org
Continual learning strives to acquire knowledge across sequential tasks without forgetting
previously assimilated knowledge. Current state-of-the-art methodologies utilize dynamic …

Less is More: Selective reduction of CT data for self-supervised pre-training of deep learning models with contrastive learning improves downstream …

D Wolf, T Payer, CS Lisson, CG Lisson, M Beer… - Computers in Biology …, 2024 - Elsevier
Background: Self-supervised pre-training of deep learning models with contrastive learning
is a widely used technique in image analysis. Current findings indicate a strong potential for …
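The only mechanism the entry names is contrastive pre-training. Below is a toy InfoNCE computation for a single anchor, as a generic illustration of the contrastive objective; the 2-D vectors and temperature are invented, and nothing here reflects the paper's CT-data pipeline:

```python
import math

# Toy InfoNCE loss for one anchor: pull the positive pair together,
# push negatives apart. Generic contrastive learning, illustrative only.

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(anchor, positive, negatives, tau=0.1):
    logits = [cosine(anchor, positive) / tau]
    logits += [cosine(anchor, n) / tau for n in negatives]
    m = max(logits)  # stabilize the log-sum-exp
    log_denom = m + math.log(sum(math.exp(l - m) for l in logits))
    return log_denom - logits[0]   # -log softmax probability of the positive

anchor = [1.0, 0.0]
loss = info_nce(anchor, positive=[0.9, 0.1],
                negatives=[[0.0, 1.0], [-1.0, 0.0]])
print(round(loss, 4))  # near 0: the positive dominates the negatives
```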

Hyper-feature aggregation and relaxed distillation for class incremental learning

R Wu, H Liu, Z Yue, JB Li, CW Sham - Pattern Recognition, 2024 - Elsevier
Although neural networks have been used extensively in pattern recognition scenarios, acquiring
datasets in advance remains challenging. In most pattern recognition areas, preparing a …

Advancing autonomy through lifelong learning: a survey of autonomous intelligent systems

D Zhu, Q Bu, Z Zhu, Y Zhang, Z Wang - Frontiers in Neurorobotics, 2024 - frontiersin.org
The combination of lifelong learning algorithms with autonomous intelligent systems (AIS) is
gaining popularity due to its ability to enhance AIS performance, but the existing summaries …

FedProK: Trustworthy Federated Class-Incremental Learning via Prototypical Feature Knowledge Transfer

X Gao, X Yang, H Yu, Y Kang… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Federated Class-Incremental Learning (FCIL) focuses on continually transferring
previous knowledge to learn new classes in dynamic Federated Learning (FL). However …
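The snippet centers on prototypical feature knowledge. A minimal generic sketch of class prototypes, i.e. per-class mean feature vectors with nearest-prototype classification, is shown below; it is a standard building block, and the helper names and toy features are assumptions, not FedProK's transfer protocol:

```python
# Class prototypes as per-class mean features, with nearest-prototype
# classification. Generic building block; not FedProK's protocol.

def build_prototypes(features, labels):
    sums, counts = {}, {}
    for f, y in zip(features, labels):
        acc = sums.setdefault(y, [0.0] * len(f))
        sums[y] = [a + b for a, b in zip(acc, f)]
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def classify(feature, prototypes):
    def dist2(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return min(prototypes, key=lambda y: dist2(feature, prototypes[y]))

protos = build_prototypes([[0.0, 1.0], [0.0, 3.0], [4.0, 0.0]],
                          ["a", "a", "b"])
print(protos)                        # {'a': [0.0, 2.0], 'b': [4.0, 0.0]}
print(classify([0.5, 1.5], protos))  # 'a'
```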

Knowledge accumulation in continually learned representations and the issue of feature forgetting

T Hess, E Verwimp, GM van de Ven… - arXiv preprint arXiv …, 2023 - arxiv.org
Continual learning research has shown that neural networks suffer from catastrophic
forgetting "at the output level", but it is debated whether this is also the case at the level of …