Subspace identification for multi-source domain adaptation

Z Li, R Cai, G Chen, B Sun, Z Hao… - Advances in Neural …, 2024 - proceedings.neurips.cc
Multi-source domain adaptation (MSDA) methods aim to transfer knowledge from multiple
labeled source domains to an unlabeled target domain. Although current methods achieve …

Complementary benefits of contrastive learning and self-training under distribution shift

S Garg, A Setlur, Z Lipton… - Advances in …, 2024 - proceedings.neurips.cc
Self-training and contrastive learning have emerged as leading techniques for incorporating
unlabeled data, both under distribution shift (unsupervised domain adaptation) and when it …

Rlsbench: Domain adaptation under relaxed label shift

S Garg, N Erickson, J Sharpnack… - International …, 2023 - proceedings.mlr.press
Despite the emergence of principled methods for domain adaptation under label shift, their
sensitivity to shifts in class-conditional distributions is precariously underexplored …

ELSA: Efficient label shift adaptation through the lens of semiparametric models

Q Tian, X Zhang, J Zhao - International Conference on …, 2023 - proceedings.mlr.press
We study the domain adaptation problem with label shift in this work. Under the label shift
context, the marginal distribution of the label varies across the training and testing datasets …

Online label shift: Optimal dynamic regret meets practical algorithms

D Baby, S Garg, TC Yen… - Advances in …, 2024 - proceedings.neurips.cc
This paper focuses on supervised and unsupervised online label shift, where the class
marginals $Q(y)$ vary but the class-conditionals $Q(x|y)$ remain invariant. In the …

Any-Shift Prompting for Generalization over Distributions

Z Xiao, J Shen, MM Derakhshani… - Proceedings of the …, 2024 - openaccess.thecvf.com
Image-language models with prompt learning have shown remarkable advances in
numerous downstream vision tasks. Nevertheless, conventional prompt learning methods …

[PDF] Robust Machine Learning: Detection, Evaluation and Adaptation Under Distribution Shift

S Garg - 2024 - kilthub.cmu.edu
Deep learning, despite its broad applicability, grapples with robustness challenges in real-
world applications, especially when training and test distributions differ. Reasons for the …

[CITATION][C] Towards Reliable Machine Learning: Evaluating and Robustifying Deep Neural Networks

S Garg