A comprehensive survey on test-time adaptation under distribution shifts

J Liang, R He, T Tan - International Journal of Computer Vision, 2024 - Springer
Machine learning methods strive to acquire a robust model during the training
process that can effectively generalize to test samples, even in the presence of distribution …

Dataset distillation: A comprehensive review

R Yu, S Liu, X Wang - IEEE Transactions on Pattern Analysis …, 2023 - ieeexplore.ieee.org
Recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite the unprecedented success, the massive data …

Fine-tuning global model via data-free knowledge distillation for non-iid federated learning

L Zhang, L Shen, L Ding, D Tao… - Proceedings of the …, 2022 - openaccess.thecvf.com
Federated Learning (FL) is an emerging distributed learning paradigm under privacy
constraint. Data heterogeneity is one of the main challenges in FL, which results in slow …

Large language models are reasoning teachers

N Ho, L Schmid, SY Yun - arXiv preprint arXiv:2212.10071, 2022 - arxiv.org
Recent works have shown that chain-of-thought (CoT) prompting can elicit step-by-step
reasoning from language models to solve complex tasks. However, prompt-based CoT methods are …

Knowledge distillation: A survey

J Gou, B Yu, SJ Maybank, D Tao - International Journal of Computer Vision, 2021 - Springer
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …

Ensemble distillation for robust model fusion in federated learning

T Lin, L Kong, SU Stich, M Jaggi - Advances in neural …, 2020 - proceedings.neurips.cc
Federated Learning (FL) is a machine learning setting where many devices collaboratively
train a machine learning model while keeping the training data decentralized. In most of the …

A survey on efficient convolutional neural networks and hardware acceleration

D Ghimire, D Kil, S Kim - Electronics, 2022 - mdpi.com
Over the past decade, deep-learning-based representations have demonstrated remarkable
performance in academia and industry. The learning capability of convolutional neural …

Source-free domain adaptation for semantic segmentation

Y Liu, W Zhang, J Wang - … of the IEEE/CVF Conference on …, 2021 - openaccess.thecvf.com
Unsupervised Domain Adaptation (UDA) can tackle the challenge that
convolutional neural network (CNN)-based approaches for semantic segmentation heavily …

Knowledge distillation and student-teacher learning for visual intelligence: A review and new outlooks

L Wang, KJ Yoon - IEEE Transactions on Pattern Analysis and …, 2021 - ieeexplore.ieee.org
Deep neural models have, in recent years, been successful in almost every field, even
for the most complex problems. However, these models are huge in size with …

Dataset condensation with gradient matching

B Zhao, KR Mopuri, H Bilen - arXiv preprint arXiv:2006.05929, 2020 - arxiv.org
As the state-of-the-art machine learning methods in many fields rely on larger datasets,
storing datasets and training models on them become significantly more expensive. This …