Rethinking few-shot class-incremental learning with open-set hypothesis in hyperbolic geometry
By first training on a large base dataset, Few-Shot Class-Incremental Learning (FSCIL) aims to continually learn a sequence of few-shot learning tasks with novel classes. FSCIL faces two main challenges: overfitting on novel classes with limited labeled samples and catastrophic forgetting of previously seen classes. The current FSCIL protocol mimics the general class-incremental learning setting by building a unified framework, yet existing frameworks under this protocol are always biased toward the classes in the base dataset, because the performance of a deep model is largely determined by the size of its training dataset. Moreover, the stability-plasticity trade-off is difficult to handle within a unified FSCIL framework. To address these issues, we rethink the configuration of FSCIL under an open-set hypothesis, reserving room in the first session for incoming categories. To find a better decision boundary between the closed space and the open space, a Hyperbolic Reciprocal Point Learning module (Hyper-RPL) is built on Reciprocal Point Learning with hyperbolic neural networks. In addition, when learning novel categories from limited labeled data, we incorporate a hyperbolic metric learning (Hyper-Metric) module into a distillation-based framework to alleviate overfitting and to better balance the preservation of old knowledge against the acquisition of new knowledge. Finally, comprehensive evaluations of the proposed configuration and modules on three benchmark datasets validate their effectiveness and achieve state-of-the-art results.
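To make the Hyper-RPL idea more concrete, below is a minimal sketch of a distance-based reciprocal-point head in the Poincaré ball (curvature 1). It only illustrates the two ingredients named in the abstract, hyperbolic distances and reciprocal points with a bounded open-space term; the helper names (`project_to_ball`, `hyper_rpl_logits`, `open_space_penalty`) and the margin handling are assumptions for illustration, not the paper's released implementation.

```python
# Illustrative sketch of a hyperbolic reciprocal-point classifier head
# (Poincare ball model, curvature c = 1). Function names and the margin
# handling are assumptions, not the authors' code.
import numpy as np


def project_to_ball(x: np.ndarray, max_norm: float = 0.9) -> np.ndarray:
    """Rescale vectors so they stay strictly inside the unit Poincare ball."""
    norm = np.linalg.norm(x, axis=-1, keepdims=True)
    scale = np.minimum(1.0, max_norm / np.maximum(norm, 1e-9))
    return x * scale


def poincare_distance(x: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Geodesic distance between points inside the unit Poincare ball."""
    sq = np.sum((x - y) ** 2, axis=-1)
    denom = (1.0 - np.sum(x ** 2, axis=-1)) * (1.0 - np.sum(y ** 2, axis=-1))
    return np.arccosh(1.0 + 2.0 * sq / np.maximum(denom, 1e-9))


def hyper_rpl_logits(z: np.ndarray, reciprocal_points: np.ndarray) -> np.ndarray:
    """Score each class by the hyperbolic distance to its reciprocal point.

    A reciprocal point summarizes the extra-class (open) space of a class,
    so a larger distance from an embedding to that point indicates stronger
    membership; the distances are used directly as logits.
    """
    # z: (batch, dim) embeddings; reciprocal_points: (classes, dim)
    return poincare_distance(z[:, None, :], reciprocal_points[None, :, :])


def open_space_penalty(logits: np.ndarray, labels: np.ndarray, margin: float) -> float:
    """Keep each known sample within a margin of its class's reciprocal
    point, which bounds the open-space risk."""
    d_true = logits[np.arange(len(labels)), labels]
    return float(np.mean(np.maximum(d_true - margin, 0.0) ** 2))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    z = project_to_ball(rng.standard_normal((4, 8)))       # embeddings in the ball
    points = project_to_ball(rng.standard_normal((5, 8)))  # one reciprocal point per class
    logits = hyper_rpl_logits(z, points)                   # (4, 5) distance-based logits
    labels = np.array([0, 1, 2, 3])
    print(logits.shape, open_space_penalty(logits, labels, margin=1.0))
```

In practice such a head would be trained jointly with cross-entropy over the distance logits, and the same hyperbolic distance could serve as the metric in the Hyper-Metric distillation term; the snippet above only fixes the geometry and the open-space bound.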