Dynamic support network for few-shot class incremental learning
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022•ieeexplore.ieee.org
Few-shot class-incremental learning (FSCIL) is challenged by catastrophic forgetting of old classes and over-fitting to new classes. Our analyses reveal that both problems are caused by feature distribution crumbling, which leads to class confusion when few samples are continuously embedded into a fixed feature space. In this study, we propose a Dynamic Support Network (DSN), an adaptively updating network with compressive node expansion that "supports" the feature space. In each training session, DSN tentatively expands network nodes to enlarge the feature representation capacity for incremental classes. It then dynamically compresses the expanded network by node self-activation to pursue a compact feature representation, which alleviates over-fitting. Simultaneously, DSN selectively recalls old class distributions during incremental learning to support feature distributions and avoid confusion between classes. With compressive node expansion and class distribution recalling, DSN provides a systematic solution to catastrophic forgetting and over-fitting. Experiments on the CUB, CIFAR-100, and miniImageNet datasets show that DSN significantly improves upon baseline approaches, achieving new state-of-the-art results.
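The abstract does not spell out how "compressive node expansion" is implemented, but the expand-then-compress idea can be illustrated with a minimal NumPy sketch. Everything below is a hypothetical stand-in, not the paper's actual method: `expand_nodes` tentatively appends new output nodes to a layer's weight matrix, and `compress_by_activation` prunes the expanded layer back down by keeping the nodes with the largest mean absolute activation, a rough proxy for the paper's node self-activation criterion.

```python
import numpy as np

def expand_nodes(W, n_new, rng):
    """Tentatively append n_new output nodes (rows) to weight matrix W,
    initialized with small random values (assumed initialization)."""
    extra = rng.normal(scale=0.01, size=(n_new, W.shape[1]))
    return np.vstack([W, extra])

def compress_by_activation(W, X, keep):
    """Compress the expanded layer: keep the `keep` nodes whose mean
    absolute activation on data X is largest, discarding the rest.
    (A simple proxy for DSN's node self-activation, not the real rule.)"""
    acts = np.abs(X @ W.T).mean(axis=0)      # mean |activation| per node
    idx = np.sort(np.argsort(acts)[-keep:])  # top-`keep` nodes, original order
    return W[idx]

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4))                  # 8 nodes, 4-dim features
W_expanded = expand_nodes(W, 4, rng)         # tentatively grow to 12 nodes
X = rng.normal(size=(16, 4))                 # a batch of feature vectors
W_compact = compress_by_activation(W_expanded, X, keep=10)  # shrink to 10
```

In this sketch the expansion size, the pruning score, and the final node count are all free parameters; in the actual DSN these would be driven by the incremental session's new classes and the self-activation mechanism described in the paper.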