CLIP-KD: An Empirical Study of Distilling CLIP Models
CLIP has become a promising language-supervised visual pre-training framework and
achieves excellent performance over a wide range of tasks. This paper aims to distill small …
Learning from human educational wisdom: A student-centered knowledge distillation method
Existing studies on knowledge distillation typically focus on teacher-centered methods, in
which the teacher network is trained according to its own standards before transferring the …
Remaining useful life prediction across machines using multi-source adversarial online knowledge distillation
K Liu, Y Li - Engineering Applications of Artificial Intelligence, 2024 - Elsevier
Deep transfer learning has been extensively developed in the remaining useful life
prediction of rolling bearings because it can decrease the dependence on massive labeled …
CLIP-KD: An Empirical Study of CLIP Model Distillation
Contrastive Language-Image Pre-training (CLIP) has become a promising
language-supervised visual pre-training framework. This paper aims to distill small CLIP …
Efficient masked autoencoders with self-consistency
Inspired by the masked language modeling (MLM) in natural language processing tasks, the
masked image modeling (MIM) has been recognized as a strong self-supervised pre …
A voice spoofing detection framework for IoT systems with feature pyramid and online knowledge distillation
Y Ren, H Peng, L Li, X Xue, Y Lan, Y Yang - Journal of Systems …, 2023 - Elsevier
Voice anti-spoofing is an important step for secure speaker verification in voice-enabled
Internet of Things (IoT) systems. Most voice spoofing detection methods require significant …
A survey of historical learning: Learning models with learning history
New knowledge originates from the old. The various types of elements deposited in the
training history are a wealth of resources for improving the learning of deep models. In this …
PURF: Improving teacher representations by imposing smoothness constraints for knowledge distillation
Knowledge distillation is one of the most persuasive approaches to model
compression that transfers the representational expertise from large deep-learning teacher …
Online_XKD: An online knowledge distillation model for underwater object detection
X Chen, X Chen, F Wu, H Wang, H Yao - Computers and Electrical …, 2024 - Elsevier
Underwater object detection in the field of computer vision faces unique challenges such as
color distortion, reduced visibility, and blurred edges caused by aquatic conditions, which …
SAMCL: Subgraph-Aligned Multiview Contrastive Learning for Graph Anomaly Detection
J Hu, B Xiao, H Jin, J Duan, S Wang… - … on Neural Networks …, 2023 - ieeexplore.ieee.org
Graph anomaly detection (GAD) has gained increasing attention in various attribute graph
applications, i.e., social communication and financial fraud transaction networks. Recently …