A survey on few-shot class-incremental learning
Large deep learning models are impressive, but they struggle when real-time data is not
available. Few-shot class-incremental learning (FSCIL) poses a significant challenge for …
A comprehensive survey of forgetting in deep learning beyond continual learning
Forgetting refers to the loss or deterioration of previously acquired knowledge. While
existing surveys on forgetting have primarily focused on continual learning, forgetting is a …
Learning optimal inter-class margin adaptively for few-shot class-incremental learning via neural collapse-based meta-learning
Few-Shot Class-Incremental Learning (FSCIL) aims to learn new classes
incrementally with a limited number of samples per class. It faces issues of forgetting …
Few-shot learning for image denoising
Deep Neural Networks (DNNs) have achieved impressive results on the task of image
denoising, but there are two serious problems. First, the denoising ability of DNNs-based …
Neural collapse terminus: A unified solution for class incremental learning and its variants
How to enable the learning of new classes while preserving performance on old classes
has been a crucial challenge for class incremental learning. Beyond the normal case, long …
Learning prompt with distribution-based feature replay for few-shot class-incremental learning
Few-shot Class-Incremental Learning (FSCIL) aims to continuously learn new classes
based on very limited training data without forgetting the old ones encountered. Existing …
Few-shot class-incremental audio classification using dynamically expanded classifier with self-attention modified prototypes
Most existing methods for audio classification assume that the vocabulary of audio classes
to be classified is fixed. When novel (unseen) audio classes appear, audio classification …
Pseudo-set frequency refinement architecture for fine-grained few-shot class-incremental learning
Few-shot class-incremental learning was introduced to solve the model adaptation problem
for new incremental classes with only a few examples while still remaining effective for old …
Few-shot classification with fork attention adapter
J Sun, J Li - Pattern Recognition, 2024 - Elsevier
Few-shot learning aims to transfer the knowledge learned from seen categories to unseen
categories with a few references. It is also an essential challenge to bridge the gap between …
Towards Continual Learning Desiderata via HSIC-Bottleneck Orthogonalization and Equiangular Embedding
D Li, T Wang, J Chen, Q Ren, K Kawaguchi… - Proceedings of the AAAI …, 2024 - ojs.aaai.org
Deep neural networks are susceptible to catastrophic forgetting when trained on sequential
tasks. Various continual learning (CL) methods often rely on exemplar buffers and/or …