Source data-absent unsupervised domain adaptation through hypothesis transfer and labeling transfer
Unsupervised domain adaptation (UDA) aims to transfer knowledge from a related but
different well-labeled source domain to a new unlabeled target domain. Most existing UDA …
Connect, not collapse: Explaining contrastive learning for unsupervised domain adaptation
We consider unsupervised domain adaptation (UDA), where labeled data from a source
domain (e.g., photos) and unlabeled data from a target domain (e.g., sketches) are used to …
Extending the WILDS benchmark for unsupervised adaptation
Machine learning systems deployed in the wild are often trained on a source distribution but
deployed on a different target distribution. Unlabeled data can be a powerful point of …
CLDA: Contrastive learning for semi-supervised domain adaptation
A Singh - Advances in Neural Information Processing …, 2021 - proceedings.neurips.cc
Unsupervised Domain Adaptation (UDA) aims to align the labeled source
distribution with the unlabeled target distribution to obtain domain invariant predictive …
Complementary benefits of contrastive learning and self-training under distribution shift
Self-training and contrastive learning have emerged as leading techniques for incorporating
unlabeled data, both under distribution shift (unsupervised domain adaptation) and when it …
Subsidiary prototype alignment for universal domain adaptation
Universal Domain Adaptation (UniDA) deals with the problem of knowledge transfer
between two datasets with domain-shift as well as category-shift. The goal is to categorize …
Concurrent subsidiary supervision for unsupervised source-free domain adaptation
The prime challenge in unsupervised domain adaptation (DA) is to mitigate the domain shift
between the source and target domains. Prior DA works show that pretext tasks could be …
Adaptive betweenness clustering for semi-supervised domain adaptation
Compared to unsupervised domain adaptation, semi-supervised domain adaptation (SSDA)
aims to significantly improve the classification performance and generalization capability of …
Inter-domain mixup for semi-supervised domain adaptation
Semi-supervised domain adaptation (SSDA) aims to bridge source and target domain
distributions, with a small number of target labels available, achieving better classification …
Source-free semi-supervised domain adaptation via progressive Mixup
Existing domain adaptation methods usually perform explicit representation alignment by
simultaneously accessing the source data and target data. However, the source data are not …