Dataset distillation: A comprehensive review

R Yu, S Liu, X Wang - IEEE Transactions on Pattern Analysis …, 2023 - ieeexplore.ieee.org
The recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite this unprecedented success, the massive data …

Edge-cloud polarization and collaboration: A comprehensive survey for AI

J Yao, S Zhang, Y Yao, F Wang, J Ma… - … on Knowledge and …, 2022 - ieeexplore.ieee.org
Influenced by the great success of deep learning via cloud computing and the rapid
development of edge chips, research in artificial intelligence (AI) has shifted to both of the …

Knowledge distillation: A survey

J Gou, B Yu, SJ Maybank, D Tao - International Journal of Computer Vision, 2021 - Springer
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …

Linkless link prediction via relational distillation

Z Guo, W Shiao, S Zhang, Y Liu… - International …, 2023 - proceedings.mlr.press
Graph Neural Networks (GNNs) have shown exceptional performance in the task of
link prediction. Despite their effectiveness, the high latency brought by non-trivial …

A sentence speaks a thousand images: Domain generalization through distilling CLIP with language guidance

Z Huang, A Zhou, Z Ling, M Cai… - Proceedings of the …, 2023 - openaccess.thecvf.com
Domain generalization studies the problem of training a model with samples from
several domains (or distributions) and then testing the model with samples from a new …

ICD-Face: Intra-class compactness distillation for face recognition

Z Yu, J Liu, H Qin, Y Wu, K Hu… - Proceedings of the …, 2023 - openaccess.thecvf.com
Knowledge distillation is an effective model compression method to improve the
performance of a lightweight student model by transferring the knowledge of a well …
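The teacher-student transfer named in this snippet can be made concrete with the classic soft-target distillation objective. The sketch below is the generic Hinton-style loss, not ICD-Face's intra-class compactness loss; the temperature `T` and mixing weight `alpha` are illustrative defaults, not values from the paper.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Hard-label cross-entropy on the ground truth.
    ce = F.cross_entropy(student_logits, labels)
    # KL divergence between temperature-softened student and teacher
    # distributions; the T*T factor keeps gradient magnitudes comparable
    # to the cross-entropy term.
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * ce + (1.0 - alpha) * kl
```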

Learning compatible embeddings

Q Meng, C Zhang, X Xu, F Zhou - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com
Achieving backward compatibility when rolling out new models can greatly reduce costs or
even bypass feature re-encoding of existing gallery images for in-production visual retrieval …
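The backward-compatibility idea in this snippet has a simple baseline form: train the new encoder so its embeddings stay close to the frozen old encoder's embeddings of the same images, so the existing gallery never needs re-encoding. The sketch below shows only that naive alignment baseline, not the method proposed in the paper above; `new_model`, `old_model`, and the loss weight are hypothetical.

```python
import torch
import torch.nn.functional as F

def alignment_loss(new_embed, old_embed):
    # Pull the new model's L2-normalized embedding of each image toward
    # the frozen old model's embedding, so new queries remain searchable
    # against a gallery encoded by the old model.
    return F.mse_loss(F.normalize(new_embed, dim=1),
                      F.normalize(old_embed, dim=1))

# Inside the training loop (sketch):
#   with torch.no_grad():
#       old_embed = old_model(images)   # frozen old encoder
#   new_embed = new_model(images)
#   loss = task_loss + 0.1 * alignment_loss(new_embed, old_embed)
```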

AMD: Automatic multi-step distillation of large-scale vision models

C Han, Q Wang, SA Dianat, M Rabbani… - … on Computer Vision, 2025 - Springer
Transformer-based architectures have become the de facto standard models for diverse
vision tasks owing to their superior performance. As the size of these transformer-based …

A survey of face recognition

X Wang, J Peng, S Zhang, B Chen, Y Wang… - arXiv preprint arXiv …, 2022 - arxiv.org
Recent years have witnessed breakthroughs in face recognition with deep convolutional neural
networks. Dozens of face recognition (FR) papers are published every year. Some of them were …

Computation-efficient deep learning for computer vision: A survey

Y Wang, Y Han, C Wang, S Song… - Cybernetics and …, 2024 - ieeexplore.ieee.org
Over the past decade, deep learning models have advanced considerably,
reaching or even exceeding human-level performance in a range of visual perception tasks …