Digital twin enhanced federated reinforcement learning with lightweight knowledge distillation in mobile networks

X Zhou, X Zheng, X Cui, J Shi, W Liang… - IEEE Journal on …, 2023 - ieeexplore.ieee.org
High-speed mobile networks offer great potential for many future intelligent applications,
such as autonomous vehicles in smart transportation systems. Such networks provide the …

A cooperative vehicle-infrastructure system for road hazards detection with edge intelligence

C Chen, G Yao, L Liu, Q Pei, H Song… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Road hazards (RH) have always been the cause of many serious traffic accidents. These
have posed a threat to the safety of drivers, passengers, and pedestrians, and have also …

Lightweight Deep Learning for Resource-Constrained Environments: A Survey

HI Liu, M Galindo, H Xie, LK Wong, HH Shuai… - ACM Computing …, 2024 - dl.acm.org
Over the past decade, deep learning has dominated various domains of artificial
intelligence, including natural language processing, computer vision …

Online knowledge distillation via mutual contrastive learning for visual recognition

C Yang, Z An, H Zhou, F Zhuang, Y Xu… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Teacher-free online Knowledge Distillation (KD) aims to train an ensemble of student
models collaboratively, with each model distilling knowledge from the others. Although existing …
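The peer-to-peer objective described in this snippet (students supervising one another without a pre-trained teacher) can be sketched roughly as below. This is a generic mutual-learning loss in the spirit of deep mutual learning, not the contrastive method of this particular paper; all function names are illustrative.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D logit vector."""
    z = np.asarray(z, dtype=float)
    z -= z.max()
    e = np.exp(z)
    return e / e.sum()

def kl(p, q):
    """KL divergence KL(p || q) between two discrete distributions."""
    return float(np.sum(p * (np.log(p) - np.log(q))))

def mutual_kd_losses(logits_a, logits_b, labels_onehot):
    """Each student minimizes its own cross-entropy plus a KL term
    pulling its prediction toward its peer's (no pre-trained teacher)."""
    pa, pb = softmax(logits_a), softmax(logits_b)
    y = np.asarray(labels_onehot, dtype=float)
    ce_a = -float(np.sum(y * np.log(pa)))
    ce_b = -float(np.sum(y * np.log(pb)))
    loss_a = ce_a + kl(pb, pa)  # student A imitates student B
    loss_b = ce_b + kl(pa, pb)  # student B imitates student A
    return loss_a, loss_b
```

When both students agree, the KL terms vanish and each loss reduces to its own cross-entropy; the mutual terms only act where the peers disagree.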

Knowledge condensation distillation

C Li, M Lin, Z Ding, N Lin, Y Zhuang, Y Huang… - … on Computer Vision, 2022 - Springer
Knowledge Distillation (KD) transfers the knowledge from a high-capacity teacher
network to strengthen a smaller student. Existing methods focus on excavating the …
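The standard teacher-to-student transfer this snippet refers to can be sketched as the classic temperature-softened KD loss (Hinton-style); this is a minimal illustration of the general technique, not the condensation method proposed in the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    z = np.asarray(z, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(teacher_logits, student_logits, T=4.0):
    """KL(teacher || student) on temperature-softened outputs,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened prediction
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

In training, this term is usually mixed with the student's ordinary cross-entropy on the hard labels; the soft targets carry the teacher's inter-class similarity structure.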

Adaptive hierarchy-branch fusion for online knowledge distillation

L Gong, S Lin, B Zhang, Y Shen, K Li, R Qiao… - Proceedings of the …, 2023 - ojs.aaai.org
Online Knowledge Distillation (OKD) is designed to alleviate the dilemma that a
high-capacity pre-trained teacher model is not available. However, the existing methods …

KD-LightNet: A lightweight network based on knowledge distillation for industrial defect detection

J Liu, H Li, F Zuo, Z Zhao, S Lu - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
At present, deep-learning-based methods perform well in public object detection
tasks. However, two problems remain to be solved for industrial defect detection: 1) …

REAF: Remembering enhancement and entropy-based asymptotic forgetting for filter pruning

X Zhang, W Xie, Y Li, K Jiang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Neurologically, filter pruning is a procedure of forgetting and then remembering (recovering).
Prevailing methods directly forget less important information from a non-robust baseline at …
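A common baseline for the "forgetting" step this snippet criticizes is magnitude-based filter pruning, where whole convolutional filters with the smallest L1 norms are discarded. A minimal sketch (not the REAF method itself; shapes and the `keep_ratio` parameter are illustrative):

```python
import numpy as np

def prune_filters_l1(weight, keep_ratio=0.5):
    """Rank conv filters by L1 norm and keep the top fraction.
    `weight` has shape (out_channels, in_channels, kH, kW);
    returns the pruned weight and the sorted indices of kept filters."""
    norms = np.abs(weight).reshape(weight.shape[0], -1).sum(axis=1)
    n_keep = max(1, int(round(keep_ratio * weight.shape[0])))
    keep = np.sort(np.argsort(norms)[::-1][:n_keep])
    return weight[keep], keep
```

After pruning, the downstream layer's input channels must be sliced with the same indices, and the network is typically fine-tuned to recover accuracy, the "remembering" phase in the paper's framing.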

Pathological image classification via embedded fusion mutual learning

G Li, G Wu, G Xu, C Li, Z Zhu, Y Ye, H Zhang - … Signal Processing and …, 2023 - Elsevier
Deep learning models have been widely used in pathological image classification.
However, most studies employ complex but inefficient neural networks to implement this …

Deep cross-layer collaborative learning network for online knowledge distillation

T Su, Q Liang, J Zhang, Z Yu, Z Xu… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Recent online knowledge distillation (OKD) methods focus on capturing rich and useful
intermediate information by performing multi-layer feature learning. Existing works only …