Adapter Incremental Continual Learning of Efficient Audio Spectrogram Transformers

NM Selvaraj, X Guo, A Kong, B Shen, A Kot - arXiv preprint arXiv …, 2023 - arxiv.org
Continual learning involves training neural networks incrementally for new tasks while
retaining the knowledge of previous tasks. However, efficiently fine-tuning the model for
sequential tasks with minimal computational resources remains a challenge. In this paper,
we propose Task Incremental Continual Learning (TI-CL) of audio classifiers with both
parameter-efficient and compute-efficient Audio Spectrogram Transformers (AST). To reduce
the trainable parameters without performance degradation for TI-CL, we compare several …
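
The abstract describes freezing a pretrained Audio Spectrogram Transformer and training only small task-specific modules so that new tasks can be added without overwriting earlier ones. Below is a minimal, hedged sketch of that general idea: a bottleneck adapter plus a classification head per task on top of a frozen backbone. The class names (BottleneckAdapter, AdapterTICL), the adapter placement, and the stand-in encoder are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of adapter-based task-incremental continual learning (TI-CL):
# the backbone is frozen and each new task gets its own lightweight
# bottleneck adapter and classification head. Sizes are illustrative only.
import torch
import torch.nn as nn


class BottleneckAdapter(nn.Module):
    """Down-project -> nonlinearity -> up-project, added residually."""

    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)
        self.act = nn.GELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))


class AdapterTICL(nn.Module):
    """Frozen feature extractor with one adapter and head per task."""

    def __init__(self, backbone: nn.Module, dim: int):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():
            p.requires_grad = False  # only adapters and heads are trained
        self.dim = dim
        self.adapters = nn.ModuleList()
        self.heads = nn.ModuleList()

    def add_task(self, num_classes: int, bottleneck: int = 64) -> int:
        """Register a new task; returns its task id."""
        self.adapters.append(BottleneckAdapter(self.dim, bottleneck))
        self.heads.append(nn.Linear(self.dim, num_classes))
        return len(self.heads) - 1

    def forward(self, x: torch.Tensor, task_id: int) -> torch.Tensor:
        with torch.no_grad():
            feats = self.backbone(x)           # frozen features, (batch, dim)
        feats = self.adapters[task_id](feats)  # task-specific adaptation
        return self.heads[task_id](feats)      # task-specific logits


if __name__ == "__main__":
    # Stand-in frozen encoder; in practice this would be a pretrained AST.
    backbone = nn.Sequential(nn.Linear(128, 768), nn.GELU(), nn.LayerNorm(768))
    model = AdapterTICL(backbone, dim=768)

    task0 = model.add_task(num_classes=10)   # first audio task
    task1 = model.add_task(num_classes=35)   # later task; earlier adapter untouched

    x = torch.randn(4, 128)                  # dummy spectrogram features
    print(model(x, task0).shape)             # torch.Size([4, 10])
    print(model(x, task1).shape)             # torch.Size([4, 35])
```

Because only the current task's adapter and head receive gradients, earlier tasks' parameters are never updated, which is the parameter-efficiency and forgetting-avoidance angle the abstract emphasizes; the paper's actual adapter design inside the AST blocks may differ.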
