Knowledge distillation and student-teacher learning for visual intelligence: A review and new outlooks
L Wang, KJ Yoon - IEEE transactions on pattern analysis and …, 2021 - ieeexplore.ieee.org
Deep neural models have, in recent years, been successful in almost every field, solving even the most complex problems. However, these models are huge in size with …
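The survey reviews many variants of student-teacher learning; as a point of reference, below is a minimal sketch of the classic soft-target distillation loss (temperature-scaled KL divergence between teacher and student logits). This is the generic formulation rather than any specific method from the survey, and the function name and temperature value are illustrative assumptions.

```python
import torch.nn.functional as F

def soft_target_kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Classic knowledge distillation: match the student's softened class
    distribution to the teacher's via KL divergence."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # The T^2 factor keeps gradient magnitudes comparable to the hard-label loss.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2
```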
A brief survey on semantic segmentation with deep learning
S Hao, Y Zhou, Y Guo - Neurocomputing, 2020 - Elsevier
Semantic segmentation is a challenging task in computer vision. In recent years, the
performance of semantic segmentation has been greatly improved by using deep learning …
Cross-image relational knowledge distillation for semantic segmentation
Current Knowledge Distillation (KD) methods for semantic segmentation often
guide the student to mimic the teacher's structured information generated from individual …
Simcvd: Simple contrastive voxel-wise representation distillation for semi-supervised medical image segmentation
Automated segmentation in medical image analysis is a challenging task that requires a
large amount of manually labeled data. However, most existing learning-based approaches …
Channel-wise knowledge distillation for dense prediction
Knowledge distillation (KD) has been proven a simple and effective tool for training
compact dense prediction models. Lightweight student networks are trained by extra …
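The core idea suggested by the abstract is to normalize each channel's activation map into a spatial distribution and have the student match the teacher channel by channel. A minimal sketch follows, assuming the two feature maps already share the same shape (in practice a projection layer may be needed); the temperature and function name are illustrative, not the paper's exact formulation.

```python
import torch.nn.functional as F

def channel_wise_kd_loss(student_feat, teacher_feat, temperature=4.0):
    """Turn each channel's spatial activations into a softmax distribution and
    align student to teacher with a per-channel KL divergence."""
    n, c, h, w = teacher_feat.shape
    p_teacher = F.softmax(teacher_feat.reshape(n, c, -1) / temperature, dim=-1)
    log_p_student = F.log_softmax(student_feat.reshape(n, c, -1) / temperature, dim=-1)
    # Sum the KL over spatial positions, then average over channels and batch.
    kl = F.kl_div(log_p_student, p_teacher, reduction="none").sum(dim=-1)
    return kl.mean() * temperature ** 2
```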
Structured knowledge distillation for semantic segmentation
In this paper, we investigate the issue of knowledge distillation for training compact semantic
segmentation networks by making use of cumbersome networks. We start from the …
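Structured distillation transfers relations between pixels rather than only per-pixel class scores. The sketch below shows one common instance of this idea, a pairwise-affinity matching term between spatially aligned teacher and student feature maps; it illustrates the concept only and omits the pixel-wise and holistic (adversarial) terms the paper also discusses, so treat the names and details as assumptions.

```python
import torch
import torch.nn.functional as F

def pairwise_affinity_kd_loss(student_feat, teacher_feat):
    """Match the pairwise cosine similarities between spatial positions of the
    student and teacher feature maps (assumes matching spatial resolution)."""
    def affinity(feat):
        n, c, h, w = feat.shape
        feat = F.normalize(feat.reshape(n, c, h * w), dim=1)  # unit-norm per position
        return torch.bmm(feat.transpose(1, 2), feat)          # (n, h*w, h*w) similarities
    return F.mse_loss(affinity(student_feat), affinity(teacher_feat))
```

Because the affinity matrix grows quadratically with the number of spatial positions, features are typically pooled or subsampled before computing it.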
WaveNet: Wavelet network with knowledge distillation for RGB-T salient object detection
In recent years, various neural network architectures for computer vision have been devised,
such as the visual transformer and multilayer perceptron (MLP). A transformer based on an …
Querying labeled for unlabeled: Cross-image semantic consistency guided semi-supervised semantic segmentation
Semi-supervised semantic segmentation aims to learn a semantic segmentation model via
limited labeled images and adequate unlabeled images. The key to this task is generating …
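A common building block in this setting is to turn the model's predictions on unlabeled images into pseudo labels and mask out low-confidence pixels. The sketch below shows only that generic confidence-thresholding step, not the cross-image consistency mechanism the paper proposes; the threshold and ignore index are assumed values.

```python
import torch.nn.functional as F

def confidence_thresholded_pseudo_labels(logits, threshold=0.95, ignore_index=255):
    """Pseudo labels for unlabeled images: keep the argmax class where the model
    is confident, mark everything else as ignore_index so it is excluded from
    the unsupervised cross-entropy loss."""
    probs = F.softmax(logits, dim=1)        # (n, classes, h, w)
    confidence, labels = probs.max(dim=1)   # per-pixel confidence and class
    labels[confidence < threshold] = ignore_index
    return labels
```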
Efficient medical image segmentation based on knowledge distillation
Recent advances have been made in applying convolutional neural networks to achieve
more precise prediction results for medical image segmentation problems. However, the …
Structural and statistical texture knowledge distillation for semantic segmentation
Existing knowledge distillation works for semantic segmentation mainly focus on transferring
high-level contextual knowledge from teacher to student. However, low-level texture …