Knowledge distillation and student-teacher learning for visual intelligence: A review and new outlooks

L Wang, KJ Yoon - IEEE Transactions on Pattern Analysis and …, 2021 - ieeexplore.ieee.org
Deep neural models have, in recent years, been successful in almost every field, even
for the most complex problems. However, these models are huge in size with …

A comprehensive study of class incremental learning algorithms for visual tasks

E Belouadah, A Popescu, I Kanellos - Neural Networks, 2021 - Elsevier
The ability of artificial agents to increment their capabilities when confronted with new data is
an open challenge in artificial intelligence. The main challenge faced in such cases is …

DyTox: Transformers for continual learning with dynamic token expansion

A Douillard, A Ramé, G Couairon… - Proceedings of the …, 2022 - openaccess.thecvf.com
Deep network architectures struggle to continually learn new tasks without forgetting the
previous tasks. A recent trend indicates that dynamic architectures based on an expansion …

Knowledge distillation: A survey

J Gou, B Yu, SJ Maybank, D Tao - International Journal of Computer Vision, 2021 - Springer
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …

Maintaining discrimination and fairness in class incremental learning

B Zhao, X Xiao, G Gan, B Zhang… - Proceedings of the …, 2020 - openaccess.thecvf.com
Deep neural networks (DNNs) have been applied in class incremental learning, which aims
to solve common real-world problems of learning new classes continually. One drawback of …

PLOP: Learning without forgetting for continual semantic segmentation

A Douillard, Y Chen, A Dapogny… - Proceedings of the …, 2021 - openaccess.thecvf.com
Deep learning approaches are nowadays ubiquitously used to tackle computer vision tasks
such as semantic segmentation, requiring large datasets and substantial computational …

Incremental learning techniques for semantic segmentation

U Michieli, P Zanuttigh - Proceedings of the IEEE/CVF …, 2019 - openaccess.thecvf.com
Deep learning architectures exhibit a critical drop of performance due to catastrophic
forgetting when they are required to incrementally learn new tasks. Contemporary …

There is more than meets the eye: Self-supervised multi-object detection and tracking with sound by distilling multimodal knowledge

FR Valverde, JV Hurtado… - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com
Attributes of sound inherent to objects can provide valuable cues to learn rich
representations for object detection and tracking. Furthermore, the co-occurrence of …

More classifiers, less forgetting: A generic multi-classifier paradigm for incremental learning

Y Liu, S Parisot, G Slabaugh, X Jia, A Leonardis… - Computer Vision–ECCV …, 2020 - Springer
Overcoming catastrophic forgetting in neural networks is a long-standing and core research
objective for incremental learning. Notable studies have shown regularization strategies …

Contrastive deep supervision

L Zhang, X Chen, J Zhang, R Dong, K Ma - European Conference on …, 2022 - Springer
The success of deep learning is usually accompanied by the growth in neural network
depth. However, the traditional training method only supervises the neural network at its last …