Learning with limited annotations: a survey on deep semi-supervised learning for medical image segmentation
Medical image segmentation is a fundamental and critical step in many image-guided
clinical approaches. Recent success of deep learning-based segmentation methods usually …
Knowledge distillation and student-teacher learning for visual intelligence: A review and new outlooks
L Wang, KJ Yoon - IEEE Transactions on Pattern Analysis and …, 2021 - ieeexplore.ieee.org
Deep neural models, in recent years, have been successful in almost every field, even
solving the most complex problem statements. However, these models are huge in size with …
Decoupled knowledge distillation
State-of-the-art distillation methods are mainly based on distilling deep features from
intermediate layers, while the significance of logit distillation is greatly overlooked. To …
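The snippet above contrasts feature distillation with logit distillation. As a point of reference, here is a minimal sketch of classic temperature-scaled logit distillation (the baseline that Decoupled KD builds on, not the paper's decomposed loss); the function names are my own, and the `T^2` scaling follows the standard Hinton-style formulation:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: higher T softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    """KL(p || q) for two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def logit_distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL between temperature-softened teacher and student predictions.

    The T^2 factor keeps gradient magnitudes roughly comparable across
    temperatures, as in the standard knowledge-distillation loss."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    return temperature ** 2 * kl_divergence(p_teacher, p_student)
```

The loss is zero when student and teacher logits agree and grows as the student's softened distribution drifts from the teacher's; Decoupled KD's contribution is to split this single KL term into target-class and non-target-class parts so each can be weighted separately.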
Revisiting weak-to-strong consistency in semi-supervised semantic segmentation
In this work, we revisit the weak-to-strong consistency framework, popularized by FixMatch
from semi-supervised classification, where the prediction of a weakly perturbed image …
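The weak-to-strong consistency idea described above can be sketched for a single unlabeled sample as follows. This is a toy, stdlib-only illustration of the FixMatch-style rule (function and parameter names are mine; the 0.95 threshold is FixMatch's commonly cited default, not a value from this paper):

```python
import math

def softmax(logits):
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def weak_to_strong_loss(weak_logits, strong_logits, threshold=0.95):
    """FixMatch-style unsupervised loss for one unlabeled sample.

    The weakly-perturbed view produces a hard pseudo-label; if its
    confidence clears `threshold`, the strongly-perturbed view is
    trained with cross-entropy to match that pseudo-label."""
    p_weak = softmax(weak_logits)
    confidence = max(p_weak)
    pseudo_label = p_weak.index(confidence)
    if confidence < threshold:
        return 0.0  # low-confidence predictions are masked out
    p_strong = softmax(strong_logits)
    return -math.log(p_strong[pseudo_label])
```

The confidence mask is what makes the scheme robust early in training, when most weak-view predictions are still unreliable and contribute no gradient.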
R-drop: Regularized dropout for neural networks
Dropout is a powerful and widely used technique to regularize the training of deep neural
networks. Though effective and performing well, the randomness introduced by dropout …
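R-Drop's core move is to penalize the inconsistency between two stochastic forward passes of the same input, each with an independently sampled dropout mask. A minimal sketch, assuming a toy linear layer standing in for the network (all names here are illustrative, not the paper's implementation):

```python
import math
import random

def softmax(logits):
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl(p, q):
    """KL(p || q) for two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def dropout_forward(weights, features, drop_prob, rng):
    """Toy linear layer with (inverted) dropout applied to the input."""
    kept = [x / (1.0 - drop_prob) if rng.random() >= drop_prob else 0.0
            for x in features]
    logits = [sum(w * x for w, x in zip(row, kept)) for row in weights]
    return softmax(logits)

def r_drop_penalty(weights, features, drop_prob, rng):
    """R-Drop consistency term: symmetric KL between two forward passes
    of the same input under two independent dropout masks."""
    p1 = dropout_forward(weights, features, drop_prob, rng)
    p2 = dropout_forward(weights, features, drop_prob, rng)
    return 0.5 * (kl(p1, p2) + kl(p2, p1))
```

With dropout disabled the two passes coincide and the penalty vanishes; with dropout on, minimizing this term pushes the sub-models induced by different masks toward consistent predictions, which is the regularization effect the paper targets.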
Model adaptation: Historical contrastive learning for unsupervised domain adaptation without source data
Unsupervised domain adaptation aims to align a labeled source domain and an unlabeled
target domain, but it requires access to the source data, which often raises concerns in data …
Communication-efficient federated learning via knowledge distillation
Federated learning is a privacy-preserving machine learning technique to train intelligent
models from decentralized data, which enables exploiting private data by communicating …
StylizedNeRF: consistent 3D scene stylization as stylized NeRF via 2D-3D mutual learning
3D scene stylization aims at generating stylized images of the scene from arbitrary
novel views following a given set of style examples, while ensuring consistency when …
Knowledge distillation: A survey
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …