Adapter Incremental Continual Learning of Efficient Audio Spectrogram Transformers
Continual learning involves training neural networks incrementally for new tasks while
retaining the knowledge of previous tasks. However, efficiently fine-tuning the model for
sequential tasks with minimal computational resources remains a challenge. In this paper,
we propose Task Incremental Continual Learning (TI-CL) of audio classifiers with both
parameter-efficient and compute-efficient Audio Spectrogram Transformers (AST). To reduce
the trainable parameters without performance degradation for TI-CL, we compare several …
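The abstract excerpt does not spell out the adapter design, but parameter-efficient fine-tuning of transformers commonly uses a bottleneck adapter: a small down-projection, a nonlinearity, an up-projection, and a residual connection, inserted into each frozen transformer layer. The sketch below (hypothetical names; NumPy stands in for the actual deep-learning framework) illustrates the idea and why only a tiny fraction of parameters needs training per task.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

class BottleneckAdapter:
    """Illustrative bottleneck adapter: down-project, nonlinearity,
    up-project, plus a residual connection. Only these two small matrices
    would be trained per task; the frozen transformer weights are shared
    across all tasks (an assumption for illustration, not the paper's
    exact architecture)."""

    def __init__(self, d_model, d_bottleneck, seed=0):
        rng = np.random.default_rng(seed)
        self.W_down = rng.normal(0.0, 0.02, (d_model, d_bottleneck))
        # Zero-initialized up-projection makes the adapter an identity
        # map at the start of training, so inserting it does not perturb
        # the pretrained model.
        self.W_up = np.zeros((d_bottleneck, d_model))

    def __call__(self, h):
        # h: (seq_len, d_model) hidden states from a frozen layer
        return h + relu(h @ self.W_down) @ self.W_up

# Usage: with W_up zero-initialized, output equals input exactly.
h = np.ones((4, 8))
adapter = BottleneckAdapter(d_model=8, d_bottleneck=2)
out = adapter(h)  # identical to h before any training
```

Because the bottleneck dimension is much smaller than the model width, each new task adds only `2 * d_model * d_bottleneck` trainable parameters instead of fine-tuning the full backbone.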