Dataset distillation: A comprehensive review
Recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite the unprecedented success, the massive data …
Edge-cloud polarization and collaboration: A comprehensive survey for AI
Influenced by the great success of deep learning via cloud computing and the rapid
development of edge chips, research in artificial intelligence (AI) has shifted to both of the …
Knowledge distillation: A survey
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …
Linkless link prediction via relational distillation
Abstract Graph Neural Networks (GNNs) have shown exceptional performance in the task of
link prediction. Despite their effectiveness, the high latency brought by non-trivial …
A sentence speaks a thousand images: Domain generalization through distilling clip with language guidance
Abstract Domain generalization studies the problem of training a model with samples from
several domains (or distributions) and then testing the model with samples from a new …
ICD-Face: Intra-class compactness distillation for face recognition
Abstract Knowledge distillation is an effective model compression method to improve the
performance of a lightweight student model by transferring the knowledge of a well …
Learning compatible embeddings
Achieving backward compatibility when rolling out new models can highly reduce costs or
even bypass feature re-encoding of existing gallery images for in-production visual retrieval …
AMD: Automatic multi-step distillation of large-scale vision models
Transformer-based architectures have become the de-facto standard models for diverse
vision tasks owing to their superior performance. As the size of these transformer-based …
A survey of face recognition
X Wang, J Peng, S Zhang, B Chen, Y Wang… - arXiv preprint arXiv …, 2022 - arxiv.org
Recent years have witnessed the breakthrough of face recognition with deep convolutional neural
networks. Dozens of papers in the field of FR are published every year. Some of them were …
Computation-efficient deep learning for computer vision: A survey
Over the past decade, deep learning models have exhibited considerable advancements,
reaching or even exceeding human-level performance in a range of visual perception tasks …