A comprehensive survey on test-time adaptation under distribution shifts

J Liang, R He, T Tan - International Journal of Computer Vision, 2024 - Springer
Machine learning methods strive to acquire a robust model during the training
process that can effectively generalize to test samples, even in the presence of distribution …

Learning from noisy labels with deep neural networks: A survey

H Song, M Kim, D Park, Y Shin… - IEEE transactions on …, 2022 - ieeexplore.ieee.org
Deep learning has achieved remarkable success in numerous domains with the help of large
amounts of data. However, the quality of data labels is a concern because of the lack of …

Large language model as attributed training data generator: A tale of diversity and bias

Y Yu, Y Zhuang, J Zhang, Y Meng… - Advances in …, 2024 - proceedings.neurips.cc
Large language models (LLMs) have been recently leveraged as training data generators
for various natural language processing (NLP) tasks. While previous research has explored …

Part-based pseudo label refinement for unsupervised person re-identification

Y Cho, WJ Kim, S Hong… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
Unsupervised person re-identification (re-ID) aims at learning discriminative representations
for person retrieval from unlabeled data. Recent techniques accomplish this task by using …

α-IoU: A Family of Power Intersection over Union Losses for Bounding Box Regression

J He, S Erfani, X Ma, J Bailey… - Advances in Neural …, 2021 - proceedings.neurips.cc
Bounding box (bbox) regression is a fundamental task in computer vision. So far, the most
commonly used loss functions for bbox regression are the Intersection over Union (IoU) loss …

Prototypical pseudo label denoising and target structure learning for domain adaptive semantic segmentation

P Zhang, B Zhang, T Zhang, D Chen… - Proceedings of the …, 2021 - openaccess.thecvf.com
Self-training is a competitive approach to domain adaptive segmentation, which trains the
network with pseudo labels on the target domain. However, inevitably, the pseudo labels …

Robust federated learning with noisy and heterogeneous clients

X Fang, M Ye - Proceedings of the IEEE/CVF Conference …, 2022 - openaccess.thecvf.com
Model-heterogeneous federated learning is a challenging task since each client
independently designs its own model. Due to the annotation difficulty and free-riding …

Early-learning regularization prevents memorization of noisy labels

S Liu, J Niles-Weed, N Razavian… - Advances in neural …, 2020 - proceedings.neurips.cc
We propose a novel framework to perform classification via deep learning in the presence of
noisy annotations. When trained on noisy labels, deep neural networks have been observed …

Learning with noisy labels revisited: A study using real-world human annotations

J Wei, Z Zhu, H Cheng, T Liu, G Niu, Y Liu - arXiv preprint arXiv …, 2021 - arxiv.org
Existing research on learning with noisy labels mainly focuses on synthetic label noise.
Synthetic noise, though has clean structures which greatly enabled statistical analyses, often …

Understanding and improving early stopping for learning with noisy labels

Y Bai, E Yang, B Han, Y Yang, J Li… - Advances in …, 2021 - proceedings.neurips.cc
The memorization effect of deep neural networks (DNNs) plays a pivotal role in many state-of-
the-art label-noise learning methods. To exploit this property, the early stopping trick, which …