Digital twin enhanced federated reinforcement learning with lightweight knowledge distillation in mobile networks
High-speed mobile networks offer great potential for many future intelligent applications,
such as autonomous vehicles in smart transportation systems. Such networks provide the …
A cooperative vehicle-infrastructure system for road hazards detection with edge intelligence
Road hazards (RH) have always been the cause of many serious traffic accidents. These
have posed a threat to the safety of drivers, passengers, and pedestrians, and have also …
Lightweight Deep Learning for Resource-Constrained Environments: A Survey
Over the past decade, the dominance of deep learning has prevailed across various
domains of artificial intelligence, including natural language processing, computer vision …
Online knowledge distillation via mutual contrastive learning for visual recognition
Teacher-free online Knowledge Distillation (KD) aims to train an ensemble of multiple
student models collaboratively so that they distill knowledge from each other. Although existing …
Knowledge condensation distillation
Knowledge Distillation (KD) transfers the knowledge from a high-capacity teacher
network to strengthen a smaller student. Existing methods focus on excavating the …
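The snippet above defines KD in one line: a large teacher's soft predictions supervise a smaller student. A minimal sketch of the classic soft-target distillation loss (an illustrative reconstruction in plain NumPy, not any listed paper's method; the function names and the temperature value T=4 are assumptions) might look like:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax; higher T flattens the distribution."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(teacher_logits, student_logits, T=4.0):
    """KL(teacher || student) on temperature-softened outputs,
    scaled by T^2 as in the standard soft-target formulation."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

# Identical logits give zero distillation loss.
print(round(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]), 6))  # → 0.0
```

In practice this term is mixed with the usual cross-entropy on ground-truth labels, with a weight balancing the two.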
Adaptive hierarchy-branch fusion for online knowledge distillation
Online Knowledge Distillation (OKD) is designed for the setting in which a
high-capacity pre-trained teacher model is not available. However, the existing methods …
Kd-lightnet: A lightweight network based on knowledge distillation for industrial defect detection
J Liu, H Li, F Zuo, Z Zhao, S Lu - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
At present, deep learning-based methods perform well in public object detection
tasks. However, two problems remain to be solved for industrial defect detection: 1) …
Reaf: Remembering enhancement and entropy-based asymptotic forgetting for filter pruning
Neurologically, filter pruning can be viewed as a procedure of forgetting and then remembering (recovering).
Prevailing methods directly forget less important information from an unrobust baseline at …
Pathological image classification via embedded fusion mutual learning
Deep learning models have been widely used in pathological image classification.
However, most research employs complex but inefficient neural networks to implement this …
Deep cross-layer collaborative learning network for online knowledge distillation
Recent online knowledge distillation (OKD) methods focus on capturing rich and useful
intermediate information by performing multi-layer feature learning. Existing works only …