Characterizing and avoiding negative transfer
When labeled data is scarce for a specific target task, transfer learning often offers an
effective solution by utilizing data from a related source task. However, when transferring …
A survey on negative transfer
Transfer learning (TL) utilizes data or knowledge from one or more source domains to
facilitate learning in a target domain. It is particularly useful when the target domain has very …
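To make the idea in the two entries above concrete, here is a minimal sketch of how negative transfer is often characterized in practice: train a target-only baseline and a model that also uses source data, then compare them on held-out target data. The synthetic datasets, logistic-regression models, and simple pooling strategy are illustrative assumptions, not the methods of the cited papers.

```python
# Hedged sketch: compare a pooled source+target model against a target-only
# baseline on held-out target data. A negative gap indicates negative transfer.
# Dataset shapes and shifts below are illustrative placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical source (plentiful) and target (scarce, shifted) datasets.
X_source = rng.normal(0.0, 1.0, size=(5000, 20))
y_source = (X_source[:, 0] + 0.5 * X_source[:, 1] > 0).astype(int)
X_target = rng.normal(0.3, 1.2, size=(200, 20))                      # shifted inputs
y_target = (X_target[:, 0] - 0.5 * X_target[:, 1] > 0).astype(int)   # shifted concept

X_tr, X_te, y_tr, y_te = train_test_split(X_target, y_target, test_size=0.5, random_state=0)

# Target-only baseline: train on the scarce target data alone.
baseline = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
baseline_acc = accuracy_score(y_te, baseline.predict(X_te))

# Naive transfer: pool source and target data and train one model.
X_pool = np.vstack([X_source, X_tr])
y_pool = np.concatenate([y_source, y_tr])
transfer = LogisticRegression(max_iter=1000).fit(X_pool, y_pool)
transfer_acc = accuracy_score(y_te, transfer.predict(X_te))

# Positive gap: source data helped. Negative gap: negative transfer.
print(f"target-only: {baseline_acc:.3f}  pooled transfer: {transfer_acc:.3f}  "
      f"gap: {transfer_acc - baseline_acc:+.3f}")
```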
Gradient vaccine: Investigating and improving multi-task optimization in massively multilingual models
Massively multilingual models subsuming tens or even hundreds of languages pose great
challenges to multi-task optimization. While it is a common practice to apply a language …
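The gradient-conflict view in the entry above can be illustrated with a much simpler, related rule: if two tasks' gradients have negative cosine similarity, project one onto the normal plane of the other before combining them (the PCGrad-style update). The sketch below uses made-up toy gradients and is not the exact Gradient Vaccine procedure.

```python
# Hedged sketch of mitigating gradient conflicts between tasks (e.g., languages)
# via PCGrad-style projection; the toy gradients are illustrative only.
import numpy as np

def deconflict(g_i: np.ndarray, g_j: np.ndarray) -> np.ndarray:
    """Project g_i onto the normal plane of g_j if the two gradients conflict."""
    dot = float(g_i @ g_j)
    if dot < 0:                                   # negative cosine similarity => conflict
        g_i = g_i - dot / (np.linalg.norm(g_j) ** 2 + 1e-12) * g_j
    return g_i

# Two hypothetical per-task gradients of a shared parameter vector.
g_lang_a = np.array([ 1.0, -2.0, 0.5])
g_lang_b = np.array([-1.5,  1.0, 0.5])

# Combine the de-conflicted gradients into a single update direction.
update = deconflict(g_lang_a, g_lang_b) + deconflict(g_lang_b, g_lang_a)
print("combined update after de-conflicting:", update)
```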
On negative interference in multilingual models: Findings and a meta-learning treatment
Modern multilingual models are trained on concatenated text from multiple languages in
hopes of conferring benefits to each (positive transfer), with the most pronounced benefits …
Efficient meta lifelong-learning with limited memory
Current natural language processing models work well on a single task, yet they often fail to
continuously learn new tasks without forgetting previous ones as they are re-trained …
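A generic ingredient behind many limited-memory lifelong-learning setups like the one above is an episodic memory that stores a small sample of past-task examples and replays them alongside new-task batches. The reservoir-sampled buffer below is only an illustrative sketch; the buffer size, sampling scheme, and class are assumptions, not the paper's algorithm.

```python
# Hedged sketch of a small episodic memory with reservoir sampling, used to
# replay past-task examples and reduce forgetting. Generic illustration only.
import random

class EpisodicMemory:
    def __init__(self, memory_size: int = 1000):
        self.memory_size = memory_size
        self.buffer = []          # list of (example, label, task_id) tuples
        self.seen = 0

    def add(self, example, label, task_id):
        """Reservoir sampling keeps a uniform sample of everything seen so far."""
        self.seen += 1
        if len(self.buffer) < self.memory_size:
            self.buffer.append((example, label, task_id))
        else:
            idx = random.randrange(self.seen)
            if idx < self.memory_size:
                self.buffer[idx] = (example, label, task_id)

    def sample(self, k: int):
        """Draw up to k stored examples to replay alongside the current batch."""
        return random.sample(self.buffer, min(k, len(self.buffer)))

# Usage sketch: fill the memory over two hypothetical tasks, then replay.
memory = EpisodicMemory(memory_size=100)
for i in range(500):
    memory.add(example=f"x_{i}", label=i % 3, task_id=i // 250)
replayed = memory.sample(8)
print(len(memory.buffer), replayed[:2])
```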
A novel active multi-source transfer learning algorithm for time series forecasting
Q Gu, Q Dai - Applied Intelligence, 2021 - Springer
In Time Series Forecasting (TSF), researchers usually assume that enough training data can be obtained, with the old and new data satisfying the same distribution …
Integrating multi-source transfer learning, active learning and metric learning paradigms for time series prediction
Q Gu, Q Dai, H Yu, R Ye - Applied Soft Computing, 2021 - Elsevier
Traditional Time Series Prediction (TSP) algorithms assume that the training and testing data follow the same distribution and that a large amount of data can be obtained …
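One basic multi-source transfer pattern for forecasting, in the spirit of the two entries above, is to fit a simple model per source series, weight each by its error on the scarce target history, and combine the forecasts. The AR(1) models, synthetic series, and softmax weighting below are illustrative assumptions rather than the papers' actual algorithms.

```python
# Hedged sketch of multi-source transfer for forecasting: one AR(1) model per
# source series, weighted by one-step error on the short target series.
import numpy as np

rng = np.random.default_rng(1)

def fit_ar1(series: np.ndarray) -> float:
    """Least-squares AR(1) coefficient: predicts x[t] as coef * x[t-1]."""
    x_prev, x_next = series[:-1], series[1:]
    return float(x_prev @ x_next / (x_prev @ x_prev))

# Hypothetical source series (plentiful) and target series (short).
sources = [rng.normal(size=500).cumsum() for _ in range(3)]
target = rng.normal(size=30).cumsum()

coefs = np.array([fit_ar1(s) for s in sources])

# Score each source model by its one-step-ahead error on the target history.
errors = np.array([np.mean((target[1:] - c * target[:-1]) ** 2) for c in coefs])
weights = np.exp(-errors) / np.exp(-errors).sum()   # lower error -> higher weight

# Combined one-step-ahead forecast for the target series.
forecast = float(weights @ (coefs * target[-1]))
print("source weights:", np.round(weights, 3), "next-step forecast:", round(forecast, 3))
```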
Federated graph learning for cross-domain recommendation
Cross-domain recommendation (CDR) offers a promising solution to the data sparsity
problem by enabling knowledge transfer across source and target domains. However, many …
Mitigating Negative Transfer in Cross-Domain Recommendation via Knowledge Transferability Enhancement
Z Song, W Zhang, L Deng, J Zhang, Z Wu… - Proceedings of the 30th …, 2024 - dl.acm.org
Cross-Domain Recommendation (CDR) is a promising technique to alleviate data sparsity
by transferring knowledge across domains. However, the negative transfer issue in the …
A simple approach to balance task loss in multi-task learning
In multi-task learning, the training losses of different tasks vary. There are many works handling this situation, which we classify into five categories. In this paper, we propose a …
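One common family of loss-balancing schemes alluded to above rescales each task loss by the inverse of its running average magnitude so that no single task dominates the objective. The sketch below is a generic illustration with made-up losses and an assumed smoothing factor; it is not the specific approach the paper proposes.

```python
# Hedged sketch of inverse-scale loss balancing across tasks with very
# different loss magnitudes. Toy losses and smoothing factor are illustrative.
import numpy as np

rng = np.random.default_rng(2)

running = None        # exponential moving average of each task's raw loss
beta = 0.9            # smoothing factor for the running average

for step in range(5):
    # Hypothetical raw losses for three tasks on very different scales.
    raw = np.array([rng.uniform(80, 120), rng.uniform(0.8, 1.2), rng.uniform(8, 12)])
    running = raw if running is None else beta * running + (1 - beta) * raw

    # Inverse-scale weights, normalized so they sum to the number of tasks.
    weights = (1.0 / running) / (1.0 / running).sum() * len(raw)
    total = float(weights @ raw)   # balanced total loss to optimize
    print(f"step {step}: weights={np.round(weights, 3)} total={total:.3f}")
```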