DTKD-Net: Dual-Teacher Knowledge Distillation Lightweight Network for Water-related Optics Image Enhancement
Water-related optics images are often degraded by absorption and scattering effects.
Current underwater image enhancement (UIE) methods improve image quality but neglect …
DHS-DETR: Efficient DETRs with dynamic head switching
Detection Transformer (DETR) and its variants have emerged as a new paradigm for object
detection, but their high computational cost hinders practical applications. By investigating …
Teach-DETR: Better training DETR with teachers
In this paper, we present a novel training scheme, namely Teach-DETR, to better train DETR-
based detectors from versatile types of teacher detectors. We show that the predicted boxes …
Knowledge Distillation via Query Selection for Detection Transformer
Transformers have revolutionized the object detection landscape by introducing DETRs,
acclaimed for their simplicity and efficacy. Despite their advantages, the substantial size of …
LW-DETR: A Transformer Replacement to YOLO for Real-Time Detection
In this paper, we present a light-weight detection transformer, LW-DETR, which outperforms
YOLOs for real-time object detection. The architecture is a simple stack of a ViT encoder, a …
OD-DETR: Online Distillation for Stabilizing Training of Detection Transformer
DEtection TRansformer (DETR) has become a dominant paradigm, mainly due to its common
architecture with high accuracy and no post-processing. However, DETR suffers from …
Efficient transformer-based panoptic segmentation via knowledge distillation
W Zhang - 2023 - ideals.illinois.edu
Abstract Knowledge distillation has been applied to various models in different domains.
However, knowledge distillation on panoptic segmentation has not been studied so far. In …