Deep model reassembly

X Yang, D Zhou, S Liu, J Ye… - Advances in neural …, 2022 - proceedings.neurips.cc
In this paper, we explore a novel knowledge-transfer task, termed Deep Model
Reassembly (DeRy), for general-purpose model reuse. Given a collection of heterogeneous …

Transfer learning for radio frequency machine learning: a taxonomy and survey

LJ Wong, AJ Michaels - Sensors, 2022 - mdpi.com
Transfer learning is a pervasive technology in computer vision and natural language
processing fields, yielding exponential performance improvements by leveraging prior …

LogME: Practical assessment of pre-trained models for transfer learning

K You, Y Liu, J Wang, M Long - International Conference on …, 2021 - proceedings.mlr.press
This paper studies task-adaptive pre-trained model selection, an underexplored problem of
assessing pre-trained models for the target task and selecting the best ones from the model …

A survey on negative transfer

W Zhang, L Deng, L Zhang, D Wu - IEEE/CAA Journal of …, 2022 - ieeexplore.ieee.org
Transfer learning (TL) utilizes data or knowledge from one or more source domains to
facilitate learning in a target domain. It is particularly useful when the target domain has very …

Transferability in deep learning: A survey

J Jiang, Y Shu, J Wang, M Long - arXiv preprint arXiv:2201.05867, 2022 - arxiv.org
The success of deep learning algorithms generally depends on large-scale data, while
humans appear to have an inherent ability for knowledge transfer, by recognizing and applying …

How far pre-trained models are from neural collapse on the target dataset informs their transferability

Z Wang, Y Luo, L Zheng, Z Huang… - Proceedings of the …, 2023 - openaccess.thecvf.com
This paper focuses on model transferability estimation, i.e., assessing the performance of pre-
trained models on a downstream task without performing fine-tuning. Motivated by the …

Enabling all in-edge deep learning: A literature review

P Joshi, M Hasanuzzaman, C Thapa, H Afli… - IEEE Access, 2023 - ieeexplore.ieee.org
In recent years, deep learning (DL) models have demonstrated remarkable achievements
on non-trivial tasks such as speech recognition, image processing, and natural language …

Transferability estimation using Bhattacharyya class separability

M Pándy, A Agostinelli, J Uijlings… - Proceedings of the …, 2022 - openaccess.thecvf.com
Transfer learning has become a popular method for leveraging pre-trained models in
computer vision. However, without performing computationally expensive fine-tuning, it is …

Sensitivity-aware visual parameter-efficient fine-tuning

H He, J Cai, J Zhang, D Tao… - Proceedings of the …, 2023 - openaccess.thecvf.com
Visual Parameter-Efficient Fine-Tuning (PEFT) has become a powerful alternative
to full fine-tuning for adapting pre-trained vision models to downstream tasks, which only …

What to pre-train on? Efficient intermediate task selection

C Poth, J Pfeiffer, A Rücklé, I Gurevych - arXiv preprint arXiv:2104.08247, 2021 - arxiv.org
Intermediate task fine-tuning has been shown to culminate in large transfer gains across
many NLP tasks. With an abundance of candidate datasets as well as pre-trained language …