FedDisco: Federated learning with discrepancy-aware collaboration
This work considers category distribution heterogeneity in federated learning. This issue
arises from biased labeling preferences at multiple clients and is a typical setting of data …
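The snippet cuts off before the method; as a rough illustration of what discrepancy-aware collaboration could mean, the sketch below penalizes clients whose local label distribution deviates from the global one when computing aggregation weights. The weighting rule, the total-variation discrepancy, and the constant lam are illustrative assumptions, not FedDisco's exact formula.

```python
import numpy as np

# Hedged sketch of discrepancy-aware aggregation (an illustration of the idea,
# not FedDisco's actual rule): clients whose local category distribution is far
# from the global one get less weight than plain FedAvg sample-count weighting.
def disco_weights(counts_per_client, lam=0.5):
    counts = np.asarray(counts_per_client, dtype=float)  # [clients, classes]
    n = counts.sum(axis=1)                               # samples per client
    local = counts / n[:, None]                          # local label distributions
    global_dist = counts.sum(axis=0) / counts.sum()      # global label distribution
    d = np.abs(local - global_dist).sum(axis=1) / 2      # total-variation discrepancy
    w = np.maximum(n / n.sum() - lam * d, 0.0)           # downweight discrepant clients
    return w / (w.sum() + 1e-12)                         # normalize to aggregation weights
```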
Revisiting scalarization in multi-task learning: A theoretical perspective
Linear scalarization, i.e., combining all loss functions by a weighted sum, has been the
default choice in the literature of multi-task learning (MTL) since its inception. In recent years …
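For reference, linear scalarization as described above reduces multi-task training to a single objective; in standard notation, with K tasks and fixed simplex weights:

```latex
% Linear scalarization: a weighted sum of the K task losses
\min_{\theta}\; \sum_{k=1}^{K} w_k \, L_k(\theta),
\qquad w_k \ge 0, \quad \sum_{k=1}^{K} w_k = 1 .
```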
In defense of the unitary scalarization for deep multi-task learning
Recent multi-task learning research argues against unitary scalarization, where training
simply minimizes the sum of the task losses. Several ad-hoc multi-task optimization …
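Unitary scalarization, as the abstract defines it, is the all-ones special case of the weighted sum above; a minimal PyTorch training step (model, optimizer, and per-task losses are illustrative stand-ins):

```python
import torch

# Unitary scalarization: minimize the plain sum of task losses.
# `batches` pairs one (x, y) batch with each task's loss function.
def unitary_step(model, optimizer, batches, loss_fns):
    optimizer.zero_grad()
    total = sum(loss_fn(model(x), y) for loss_fn, (x, y) in zip(loss_fns, batches))
    total.backward()
    optimizer.step()
    return total.item()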
Hierarchical prompt learning for multi-task learning
Vision-language models (VLMs) can effectively transfer to various vision tasks via prompt
learning. Real-world scenarios often require adapting a model to multiple similar yet distinct …
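A minimal sketch of what hierarchical prompts for multi-task adaptation might look like, assuming one prompt shared across tasks plus a task-specific prompt per task; this structure is inferred from the title, not confirmed by the snippet:

```python
import torch
import torch.nn as nn

# Hedged sketch: a task-shared prompt is concatenated with a task-specific
# prompt before being fed to a frozen VLM text encoder (assumed design).
class HierarchicalPrompts(nn.Module):
    def __init__(self, num_tasks, shared_len=4, task_len=4, dim=512):
        super().__init__()
        self.shared = nn.Parameter(torch.randn(shared_len, dim) * 0.02)
        self.task = nn.Parameter(torch.randn(num_tasks, task_len, dim) * 0.02)

    def forward(self, task_id):
        # Returns a [shared_len + task_len, dim] prompt for the requested task.
        return torch.cat([self.shared, self.task[task_id]], dim=0)
```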
FAME-ViL: Multi-tasking vision-language model for heterogeneous fashion tasks
In the fashion domain, there exists a variety of vision-and-language (V+L) tasks, including
cross-modal retrieval, text-guided image retrieval, multi-modal classification, and image …
Auto-Lambda: Disentangling dynamic task relationships
Understanding the structure of multiple related tasks allows multi-task learning to improve
the generalisation ability of one or all of them. However, it usually requires training each …
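The snippet cuts off before the method; a toy sketch in its spirit, assuming task weights (lambdas) are meta-learned so that one differentiable inner SGD step on the weighted training loss reduces the target task's validation loss. This is a simplification on synthetic data, not the paper's exact algorithm.

```python
import torch

# Hedged bi-level sketch: outer loop updates task weights (lmb) through a
# differentiable inner SGD step on a tiny linear two-task model.
torch.manual_seed(0)
w = torch.randn(5, 2, requires_grad=True)    # shared linear model, 2 task outputs
lmb = torch.zeros(2, requires_grad=True)     # unconstrained task weights

x_tr, y_tr = torch.randn(32, 5), torch.randn(32, 2)
x_va, y_va = torch.randn(32, 5), torch.randn(32, 2)
opt_lmb = torch.optim.Adam([lmb], lr=1e-2)
lr_inner = 0.1

for step in range(100):
    task_losses = ((x_tr @ w - y_tr) ** 2).mean(dim=0)  # one MSE per task
    weights = torch.nn.functional.softplus(lmb)         # keep weights positive
    train_loss = (weights * task_losses).sum()

    # Differentiable inner step: graph is kept so lambdas receive gradient.
    (g,) = torch.autograd.grad(train_loss, w, create_graph=True)
    w_new = w - lr_inner * g

    # Outer step: target task (index 0) validation loss after the inner step.
    val_loss = ((x_va @ w_new - y_va) ** 2).mean(dim=0)[0]
    opt_lmb.zero_grad()
    val_loss.backward()
    opt_lmb.step()

    w = w_new.detach().requires_grad_(True)             # commit the inner update
```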
Addressing negative transfer in diffusion models
Diffusion-based generative models have achieved remarkable success in various domains.
Their training optimizes a shared model on denoising tasks that encompass different noise levels …
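For context, the multi-task structure the abstract refers to is visible in the standard DDPM objective, where each sampled timestep t is in effect its own denoising task sharing one network. This is the common baseline setup, not the paper's mitigation method; model(x_t, t) is an assumed noise-prediction signature.

```python
import torch

# Standard DDPM-style training loss: sample a noise level (a "task"), noise
# the clean input, and train one shared model to predict the injected noise.
T = 1000
betas = torch.linspace(1e-4, 0.02, T)
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

def diffusion_loss(model, x0):
    t = torch.randint(0, T, (x0.shape[0],))              # sample a noise level per example
    a = alphas_bar[t].view(-1, *([1] * (x0.dim() - 1)))  # broadcast to x0's shape
    eps = torch.randn_like(x0)
    x_t = a.sqrt() * x0 + (1 - a).sqrt() * eps           # forward noising process
    return ((model(x_t, t) - eps) ** 2).mean()           # noise-prediction MSE
```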
ForkMerge: Mitigating negative transfer in auxiliary-task learning
Auxiliary-Task Learning (ATL) aims to improve the performance of the target task by
leveraging the knowledge obtained from related tasks. Occasionally, learning multiple tasks …
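A hedged sketch in the spirit of ForkMerge, assuming the procedure forks the model into branches trained with different auxiliary-task weights and merges their parameters by an interpolation coefficient searched on target-task validation score. train_branch and val_score are user-supplied callbacks; the details are assumptions, not the paper's exact algorithm.

```python
import copy
import torch

# Fork two branches, train each with a different auxiliary weight, then merge
# by the parameter interpolation that scores best on target-task validation.
def fork_train_merge(model, train_branch, val_score, aux_weights=(0.0, 1.0),
                     merge_grid=(0.0, 0.25, 0.5, 0.75, 1.0)):
    branches = []
    for w_aux in aux_weights:                 # two branches for simplicity
        branch = copy.deepcopy(model)
        train_branch(branch, w_aux)           # user-supplied training loop
        branches.append(branch)

    b0, b1 = branches
    best, best_score = None, float("-inf")
    for lam in merge_grid:                    # search interpolation on validation
        merged = copy.deepcopy(b0)
        with torch.no_grad():
            for p_m, p0, p1 in zip(merged.parameters(),
                                   b0.parameters(), b1.parameters()):
                p_m.copy_((1 - lam) * p0 + lam * p1)
        s = val_score(merged)
        if s > best_score:
            best, best_score = merged, s
    return best
```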
Direction-oriented multi-objective learning: Simple and provable stochastic algorithms
Multi-objective optimization (MOO) has become an influential framework in many machine
learning problems with multiple objectives such as learning with multiple criteria and multi …
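For a concrete point of reference, the classic MGDA-style min-norm update for two objectives is shown below; this is a standard MOO baseline, not the direction-oriented algorithm the paper proposes. g1 and g2 are assumed to be flattened task gradients.

```python
import torch

# Classic two-task MGDA-style step: descend along the minimum-norm convex
# combination of the two task gradients (a common descent direction).
def min_norm_direction(g1, g2):
    # alpha minimizing ||alpha * g1 + (1 - alpha) * g2||^2, clipped to [0, 1]
    diff = g1 - g2
    alpha = (torch.dot(g2 - g1, g2) / (diff.dot(diff) + 1e-12)).clamp(0.0, 1.0)
    return alpha * g1 + (1 - alpha) * g2
```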
LibMTL: A Python library for deep multi-task learning
This paper presents LibMTL, an open-source Python library built on PyTorch, which
provides a unified, comprehensive, reproducible, and extensible implementation framework …