FedDisco: Federated learning with discrepancy-aware collaboration

R Ye, M Xu, J Wang, C Xu, S Chen… - … on Machine Learning, 2023 - proceedings.mlr.press
This work considers category distribution heterogeneity in federated learning. This issue
arises from biased labeling preferences at multiple clients and is a typical setting of data …

Revisiting scalarization in multi-task learning: A theoretical perspective

Y Hu, R Xian, Q Wu, Q Fan, L Yin… - Advances in Neural …, 2024 - proceedings.neurips.cc
Linear scalarization, i.e., combining all loss functions by a weighted sum, has been the
default choice in the multi-task learning (MTL) literature since its inception. In recent years …
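
Editorial note: for reference, the linear scalarization objective described in this snippet, written in standard MTL notation (not this paper's own symbols):

    \min_{\theta} \; \sum_{k=1}^{K} w_k \, \mathcal{L}_k(\theta), \qquad w_k \ge 0,

where \mathcal{L}_k is the loss of task k and the weights w_k are fixed hyperparameters.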

In defense of the unitary scalarization for deep multi-task learning

V Kurin, A De Palma, I Kostrikov… - Advances in …, 2022 - proceedings.neurips.cc
Recent multi-task learning research argues against unitary scalarization, where training
simply minimizes the sum of the task losses. Several ad-hoc multi-task optimization …
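
Editorial note: a minimal PyTorch sketch of unitary scalarization as defined in this snippet: one shared backbone, per-task heads, and a training step on the plain unweighted sum of task losses. The network sizes and the two example tasks (10-way classification, scalar regression) are illustrative assumptions, not the paper's setup.

    # Unitary scalarization sketch: minimize the unweighted sum of task losses.
    import torch
    import torch.nn as nn

    class MultiTaskNet(nn.Module):
        def __init__(self, in_dim=32, hidden=64, task_dims=(10, 1)):
            super().__init__()
            # Shared backbone feeding one small head per task.
            self.backbone = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
            self.heads = nn.ModuleList(nn.Linear(hidden, d) for d in task_dims)

        def forward(self, x):
            z = self.backbone(x)
            return [head(z) for head in self.heads]

    model = MultiTaskNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fns = [nn.CrossEntropyLoss(), nn.MSELoss()]

    x = torch.randn(8, 32)  # dummy batch
    targets = [torch.randint(0, 10, (8,)), torch.randn(8, 1)]

    outs = model(x)
    # "Unitary": no task weights, no gradient surgery, just the sum.
    loss = sum(fn(o, t) for fn, o, t in zip(loss_fns, outs, targets))
    opt.zero_grad()
    loss.backward()
    opt.step()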

Hierarchical prompt learning for multi-task learning

Y Liu, Y Lu, H Liu, Y An, Z Xu, Z Yao… - Proceedings of the …, 2023 - openaccess.thecvf.com
Vision-language models (VLMs) can effectively transfer to various vision tasks via prompt
learning. Real-world scenarios often require adapting a model to multiple similar yet distinct …

FAME-ViL: Multi-tasking vision-language model for heterogeneous fashion tasks

X Han, X Zhu, L Yu, L Zhang… - Proceedings of the …, 2023 - openaccess.thecvf.com
In the fashion domain, there exists a variety of vision-and-language (V+L) tasks, including
cross-modal retrieval, text-guided image retrieval, multi-modal classification, and image …

Auto-Lambda: Disentangling dynamic task relationships

S Liu, S James, AJ Davison, E Johns - arXiv preprint arXiv:2202.03091, 2022 - arxiv.org
Understanding the structure of multiple related tasks allows multi-task learning to improve
the generalisation ability of one or all of them. However, it usually requires training each …

Addressing negative transfer in diffusion models

H Go, Y Lee, S Lee, S Oh, H Moon… - Advances in Neural …, 2024 - proceedings.neurips.cc
Diffusion-based generative models have achieved remarkable success in various domains.
They train a shared model on denoising tasks that encompass different noise levels …
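
Editorial note: for background on the multi-task reading of diffusion training in this snippet, the standard DDPM objective (general background, not this paper's notation) averages a denoising loss over all noise levels, so each level t acts as one task for the shared network \epsilon_\theta:

    \mathcal{L}(\theta) = \mathbb{E}_{t,\, x_0,\, \epsilon \sim \mathcal{N}(0, I)} \left[ \left\| \epsilon - \epsilon_\theta\left( \sqrt{\bar{\alpha}_t}\, x_0 + \sqrt{1 - \bar{\alpha}_t}\, \epsilon,\; t \right) \right\|^2 \right].

In this view, interference between noise levels plays the role that negative transfer plays between tasks in MTL.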

ForkMerge: Mitigating negative transfer in auxiliary-task learning

J Jiang, B Chen, J Pan, X Wang, D Liu… - Advances in …, 2024 - proceedings.neurips.cc
Auxiliary-Task Learning (ATL) aims to improve the performance of the target task by
leveraging the knowledge obtained from related tasks. Occasionally, learning multiple tasks …

Direction-oriented multi-objective learning: Simple and provable stochastic algorithms

P Xiao, H Ban, K Ji - Advances in Neural Information …, 2024 - proceedings.neurips.cc
Multi-objective optimization (MOO) has become an influential framework in many machine
learning problems with multiple objectives, such as learning with multiple criteria and multi …

LibMTL: A Python library for deep multi-task learning

B Lin, Y Zhang - The Journal of Machine Learning Research, 2023 - dl.acm.org
This paper presents LibMTL, an open-source Python library built on PyTorch, which
provides a unified, comprehensive, reproducible, and extensible implementation framework …
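
Editorial note: not LibMTL's actual API, but a minimal sketch of the kind of pluggable loss-weighting interface such a unified framework standardizes; every name below is an illustrative assumption:

    # Illustrative only: all names are assumptions, not LibMTL's API.
    from typing import Sequence
    import torch

    class WeightingStrategy:
        """Swappable rule for combining per-task losses into one scalar."""
        def combine(self, losses: Sequence[torch.Tensor]) -> torch.Tensor:
            raise NotImplementedError

    class EqualWeighting(WeightingStrategy):
        # Unitary scalarization: plain unweighted sum.
        def combine(self, losses):
            return torch.stack(list(losses)).sum()

    class FixedWeighting(WeightingStrategy):
        # Linear scalarization with fixed user-chosen weights.
        def __init__(self, weights):
            self.weights = weights
        def combine(self, losses):
            return sum(w * l for w, l in zip(self.weights, losses))

    # The training loop is written once against the interface; strategies swap freely.
    losses = [torch.tensor(0.7, requires_grad=True), torch.tensor(1.3, requires_grad=True)]
    total = FixedWeighting([0.5, 0.5]).combine(losses)
    total.backward()

Putting many weighting strategies behind one trainer interface like this is what makes head-to-head MTL comparisons reproducible, which is the gap libraries such as LibMTL target.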