DTKD-Net: Dual-Teacher Knowledge Distillation Lightweight Network for Water-related Optics Image Enhancement

J Zhou, B Zhang, D Zhang, G Vivone… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Water-related optics images are often degraded by absorption and scattering effects.
Current underwater image enhancement (UIE) methods improve image quality but neglect …

DHS-DETR: Efficient DETRs with dynamic head switching

H Chen, C Tang, X Hu - Computer Vision and Image Understanding, 2024 - Elsevier
Detection Transformer (DETR) and its variants have emerged as a new paradigm for object
detection, but their high computational cost hinders practical applications. By investigating …

Teach-DETR: Better Training DETR With Teachers

L Huang, K Lu, G Song, L Wang, S Liu… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
In this paper, we present a novel training scheme, namely Teach-DETR, to better train DETR-
based detectors from versatile types of teacher detectors. We show that the predicted boxes …

Knowledge Distillation via Query Selection for Detection Transformer

Y Liu, L Wang, Z Tang, Y Liao, Y Sun, L Zhang… - arXiv preprint arXiv …, 2024 - arxiv.org
Transformers have revolutionized the object detection landscape by introducing DETRs,
acclaimed for their simplicity and efficacy. Despite their advantages, the substantial size of …

LW-DETR: A Transformer Replacement to YOLO for Real-Time Detection

Q Chen, X Su, X Zhang, J Wang, J Chen… - arXiv preprint arXiv …, 2024 - arxiv.org
In this paper, we present a light-weight detection transformer, LW-DETR, which outperforms
YOLOs for real-time object detection. The architecture is a simple stack of a ViT encoder, a …

OD-DETR: Online Distillation for Stabilizing Training of Detection Transformer

S Wu, L Sun, Q Li - arXiv preprint arXiv:2406.05791, 2024 - arxiv.org
DEtection TRansformer (DETR) has become a dominant paradigm, mainly due to its unified
architecture, high accuracy, and elimination of post-processing. However, DETR suffers from …

Efficient transformer-based panoptic segmentation via knowledge distillation

W Zhang - 2023 - ideals.illinois.edu
Knowledge distillation has been applied to various models in different domains.
However, knowledge distillation for panoptic segmentation has not yet been studied. In …