Distilling knowledge via knowledge review

P Chen, S Liu, H Zhao, J Jia - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
Knowledge distillation transfers knowledge from the teacher network to the student
one, with the goal of greatly improving the performance of the student network. Previous …

NTIRE 2023 challenge on efficient super-resolution: Methods and results

Y Li, Y Zhang, R Timofte, L Van Gool… - Proceedings of the …, 2023 - openaccess.thecvf.com
This paper reviews the NTIRE 2023 challenge on efficient single-image super-resolution
with a focus on the proposed solutions and results. The aim of this challenge is to devise a …

Decoupled knowledge distillation

B Zhao, Q Cui, R Song, Y Qiu… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
State-of-the-art distillation methods are mainly based on distilling deep features from
intermediate layers, while the significance of logit distillation is greatly overlooked. To …
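Logit distillation, as contrasted with feature distillation in the snippet above, classically means minimizing the KL divergence between temperature-softened teacher and student output distributions (Hinton et al., 2015). A minimal, framework-free sketch of that classic loss (not the decoupled variant proposed in this paper):

```python
import math

def _softmax(logits, T):
    """Temperature-softened softmax, shifted by the max for stability."""
    m = max(logits)
    exps = [math.exp((z - m) / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def logit_distillation_loss(student_logits, teacher_logits, T=4.0):
    """Classic logit distillation: KL(teacher || student) over
    temperature-softened distributions, scaled by T^2 so its gradient
    magnitude matches the hard-label cross-entropy term."""
    p_s = _softmax(student_logits, T)
    p_t = _softmax(teacher_logits, T)
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_t, p_s) if pt > 0)
    return kl * T * T
```

The loss is zero when student and teacher logits agree and grows as their softened distributions diverge; in practice it is combined with the usual cross-entropy on ground-truth labels.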

Anomaly detection via reverse distillation from one-class embedding

H Deng, X Li - Proceedings of the IEEE/CVF conference on …, 2022 - openaccess.thecvf.com
Knowledge distillation (KD) achieves promising results on the challenging problem
of unsupervised anomaly detection (AD). The representation discrepancy of anomalies in …

Point-to-voxel knowledge distillation for lidar semantic segmentation

Y Hou, X Zhu, Y Ma, CC Loy… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
This article addresses the problem of distilling knowledge from a large teacher model to a
slim student network for LiDAR semantic segmentation. Directly employing previous …

Knowledge distillation from a stronger teacher

T Huang, S You, F Wang, C Qian… - Advances in Neural …, 2022 - proceedings.neurips.cc
Unlike existing knowledge distillation methods that focus on baseline settings, where the
teacher models and training strategies are not as strong and competitive as state-of-the-art …

Focal and global knowledge distillation for detectors

Z Yang, Z Li, X Jiang, Y Gong, Z Yuan… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation has been applied to image classification successfully.
However, object detection is much more sophisticated and most knowledge distillation …

Bridging the gap between object and image-level representations for open-vocabulary detection

H Bangalath, M Maaz, MU Khattak… - Advances in …, 2022 - proceedings.neurips.cc
Existing open-vocabulary object detectors typically enlarge their vocabulary sizes by
leveraging different forms of weak supervision. This helps generalize to novel objects at …

Knowledge distillation with the reused teacher classifier

D Chen, JP Mei, H Zhang, C Wang… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation aims to compress a powerful yet cumbersome teacher model
into a lightweight student model without much sacrifice of performance. For this purpose …

Cross-image relational knowledge distillation for semantic segmentation

C Yang, H Zhou, Z An, X Jiang… - Proceedings of the …, 2022 - openaccess.thecvf.com
Current Knowledge Distillation (KD) methods for semantic segmentation often
guide the student to mimic the teacher's structured information generated from individual …
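Several entries above distill structured or relational information rather than raw features. A generic sketch of that idea (not the cross-image method of this particular paper): match the pairwise cosine-similarity matrix of student embeddings to that of the teacher, so the student learns the teacher's relational structure even when feature dimensions differ.

```python
import math

def _cosine(a, b):
    """Cosine similarity between two nonzero vectors."""
    num = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return num / (na * nb)

def relational_kd_loss(student_feats, teacher_feats):
    """Mean squared error between the student's and teacher's pairwise
    cosine-similarity matrices over a batch of embeddings."""
    n = len(student_feats)
    loss = 0.0
    for i in range(n):
        for j in range(n):
            ds = _cosine(student_feats[i], student_feats[j])
            dt = _cosine(teacher_feats[i], teacher_feats[j])
            loss += (ds - dt) ** 2
    return loss / (n * n)
```

Because only similarity structure is matched, the student and teacher embeddings need not share the same dimensionality, which is one motivation for relational losses in segmentation distillation.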