A comprehensive survey on test-time adaptation under distribution shifts
Machine learning methods strive to acquire, during training, a robust model that can effectively generalize to test samples even in the presence of distribution …
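To make the survey's subject concrete: one representative family of test-time adaptation methods updates a small set of parameters online by minimizing prediction entropy on each incoming test batch, as in TENT (Wang et al., 2021). The sketch below is a minimal illustration of that idea, not a method taken from the survey itself; the model is any PyTorch classifier that contains normalization layers, and the hyperparameters are placeholders.

    import torch
    import torch.nn as nn

    def norm_params(model):
        # Collect only the affine parameters of normalization layers;
        # nothing else is handed to the optimizer, so all other
        # weights stay fixed. Assumes the model has such layers.
        params = []
        for m in model.modules():
            if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.LayerNorm)):
                params += [p for p in (m.weight, m.bias) if p is not None]
        return params

    @torch.enable_grad()
    def adapt_on_batch(model, x, lr=1e-3):
        # One entropy-minimization step on an unlabeled test batch.
        opt = torch.optim.SGD(norm_params(model), lr=lr)
        probs = model(x).softmax(dim=1)
        entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()
        opt.zero_grad()
        entropy.backward()
        opt.step()
        return model(x).argmax(dim=1)  # predictions after adaptation

(TENT proper also switches BatchNorm layers to use test-batch statistics; that detail is omitted here for brevity.)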
Towards out-of-distribution generalization: A survey
Traditional machine learning paradigms are based on the assumption that both training and test data follow the same statistical pattern, which is mathematically referred to as …
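For reference, the assumption alluded to above is usually formalized as follows; the notation here is the standard one rather than taken from this particular survey:

    P_{\mathrm{train}}(X, Y) \;=\; P_{\mathrm{test}}(X, Y)

Out-of-distribution generalization concerns any violation of this equality, two classical special cases being covariate shift, $P_{\mathrm{train}}(X) \neq P_{\mathrm{test}}(X)$ with $P(Y \mid X)$ shared, and label shift, $P_{\mathrm{train}}(Y) \neq P_{\mathrm{test}}(Y)$ with $P(X \mid Y)$ shared.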
Disentangling label distribution for long-tailed visual recognition
The current evaluation protocol of long-tailed visual recognition trains the classification model on the long-tailed source label distribution and evaluates its performance on the …
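When only the label marginal differs between source and target, with the class-conditionals $p(x \mid y)$ shared, a source-trained classifier can be corrected post hoc via a standard identity (this is the general label-shift correction, not necessarily the specific estimator of the paper above):

    p_t(y \mid x) \;\propto\; p_s(y \mid x)\,\frac{p_t(y)}{p_s(y)}

In the long-tailed setting, $p_s(y)$ is the skewed training label frequency and $p_t(y)$ is typically taken to be uniform at evaluation, which recovers the familiar logit-adjustment baselines.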
WILDS: A benchmark of in-the-wild distribution shifts
Distribution shifts, where the training distribution differs from the test distribution, can substantially degrade the accuracy of machine learning (ML) systems deployed in the wild …
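WILDS is distributed as a Python package with a uniform loading interface. The sketch below follows the quickstart in the project README as of this writing; the dataset choice and transform are placeholders, and the API may have changed since.

    from wilds import get_dataset
    from wilds.common.data_loaders import get_train_loader
    import torchvision.transforms as transforms

    # Download one of the benchmark datasets; "camelyon17" is only an example.
    dataset = get_dataset(dataset="camelyon17", download=True)

    # Official training split, with a minimal transform.
    train_data = dataset.get_subset(
        "train", transform=transforms.Compose([transforms.ToTensor()])
    )

    # Standard (non-grouped) data loader over that split.
    train_loader = get_train_loader("standard", train_data, batch_size=16)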
Leveraging unlabeled data to predict out-of-distribution performance
Real-world machine learning deployments are characterized by mismatches between the source (training) and target (test) distributions that may cause performance drops. In this …
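A simple family of baselines for this problem reads off predicted target accuracy from the model's own confidences on unlabeled target data. The sketch below shows two such estimators, average confidence and a thresholded-confidence variant in the spirit of this line of work; it assumes softmax probabilities are available and is an illustration, not the paper's exact procedure.

    import numpy as np

    def avg_confidence(probs_t):
        # Estimate target accuracy as the mean max-softmax confidence.
        return probs_t.max(axis=1).mean()

    def thresholded_confidence(probs_s, labels_s, probs_t):
        # Pick a confidence threshold t on labeled source data so that
        # the fraction of source points with confidence above t equals
        # source accuracy, then report the fraction of target points
        # whose confidence exceeds t.
        conf_s = probs_s.max(axis=1)
        acc_s = (probs_s.argmax(axis=1) == labels_s).mean()
        t = np.quantile(conf_s, 1.0 - acc_s)
        return (probs_t.max(axis=1) > t).mean()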
Improving generalization of machine learning-identified biomarkers using causal modelling with examples from immune receptor diagnostics
Machine learning is increasingly used to discover diagnostic and prognostic biomarkers from high-dimensional molecular data. However, a variety of factors related to …
Self-attention between datapoints: Going beyond individual input-output pairs in deep learning
We challenge a common assumption underlying most supervised deep learning: that a model makes a prediction depending only on its parameters and the features of a single …
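The architectural core of this idea, attention applied across the batch axis so that one datapoint's prediction can depend on the others, fits in a few lines of PyTorch. The toy layer below illustrates the mechanism only; it is not the paper's full architecture, and the dimensions are placeholders.

    import torch
    import torch.nn as nn

    class AttentionBetweenDatapoints(nn.Module):
        # Treat the n datapoints in a batch as a sequence and let
        # multi-head self-attention mix information across them.
        def __init__(self, dim, num_heads=4):
            super().__init__()
            self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
            self.norm = nn.LayerNorm(dim)

        def forward(self, x):           # x: (n, dim), one row per datapoint
            h = x.unsqueeze(0)          # (1, n, dim): datapoints become the sequence
            out, _ = self.attn(h, h, h)
            return self.norm(x + out.squeeze(0))

    # Each of the 32 outputs now depends on all 32 inputs.
    layer = AttentionBetweenDatapoints(dim=16)
    mixed = layer(torch.randn(32, 16))   # shape (32, 16)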
Subspace identification for multi-source domain adaptation
Multi-source domain adaptation (MSDA) methods aim to transfer knowledge from multiple labeled source domains to an unlabeled target domain. Although current methods achieve …
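For contrast with the identification-based approach above, the oldest subspace-style adaptation baseline simply aligns the source PCA subspace with the target one (subspace alignment, Fernando et al., 2013). The numpy sketch below implements that classic baseline, not the method of this paper, and assumes d is at most the feature dimension.

    import numpy as np

    def subspace_alignment(Xs, Xt, d=10):
        # Top-d PCA bases of the centered source and target features.
        Ps = np.linalg.svd(Xs - Xs.mean(0), full_matrices=False)[2][:d].T
        Pt = np.linalg.svd(Xt - Xt.mean(0), full_matrices=False)[2][:d].T
        M = Ps.T @ Pt                    # linear map aligning the two bases
        Zs = (Xs - Xs.mean(0)) @ Ps @ M  # source features, aligned to target
        Zt = (Xt - Xt.mean(0)) @ Pt      # target features in their own subspace
        return Zs, Zt                    # train on (Zs, ys), predict on Zt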
Domain adaptation under open set label shift
S. Garg, S. Balakrishnan, et al. Advances in Neural Information Processing Systems, 2022.
We introduce the problem of domain adaptation under Open Set Label Shift (OSLS), where the label distribution can change arbitrarily and a new class may arrive during deployment …
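Writing $k$ for the number of source classes, the setting can be summarized by a mixture model; the formalization below is a common way to state it, given here as an assumption rather than a quotation from the paper:

    p_t(x) \;=\; \sum_{y=1}^{k} p_t(y)\, p(x \mid y) \;+\; p_t(k{+}1)\, p_t(x \mid y = k{+}1)

Here the class-conditionals $p(x \mid y)$ of the $k$ known classes are shared with the source, the marginals $p_t(y)$ may change arbitrarily, and class $k{+}1$ is novel, with both its prevalence and its conditional unknown at training time.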
Complementary benefits of contrastive learning and self-training under distribution shift
Self-training and contrastive learning have emerged as leading techniques for incorporating unlabeled data, both under distribution shift (unsupervised domain adaptation) and when it …
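Of the two techniques, self-training has the simpler core loop: pseudo-label the unlabeled pool with the current model, keep only confident predictions, and refit. A minimal sketch for any scikit-learn-style classifier (one exposing predict_proba and fit) follows; the threshold tau is a placeholder.

    import numpy as np

    def self_training_round(model, X_lab, y_lab, X_unlab, tau=0.9):
        # 1. Pseudo-label the unlabeled pool with the current model.
        probs = model.predict_proba(X_unlab)
        keep = probs.max(axis=1) >= tau          # 2. Keep confident predictions.
        X_aug = np.concatenate([X_lab, X_unlab[keep]])
        y_aug = np.concatenate([y_lab, probs.argmax(axis=1)[keep]])
        model.fit(X_aug, y_aug)                  # 3. Refit on the enlarged set.
        return model, int(keep.sum())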