Deep learning with noisy labels: Exploring techniques and remedies in medical image analysis
Supervised training of deep learning models requires large labeled datasets. There is a
growing interest in obtaining such datasets for medical image analysis applications …
Towards robust pattern recognition: A review
The accuracies for many pattern recognition tasks have increased rapidly year by year,
achieving or even outperforming human performance. From the perspective of accuracy …
Part-based pseudo label refinement for unsupervised person re-identification
Unsupervised person re-identification (re-ID) aims at learning discriminative representations
for person retrieval from unlabeled data. Recent techniques accomplish this task by using …
Partial success in closing the gap between human and machine vision
A few years ago, the first CNN surpassed human performance on ImageNet. However, it
soon became clear that machines lack robustness on more challenging test cases, a major …
Dividemix: Learning with noisy labels as semi-supervised learning
Deep neural networks are known to be annotation-hungry. Numerous efforts have been
devoted to reducing the annotation cost when learning with deep networks. Two prominent …
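DivideMix's key mechanism is to model per-sample training losses with a two-component Gaussian mixture and to treat the low-loss component as probably clean. Below is a minimal sketch of that split step, assuming per-sample losses are already computed; the function and variable names are illustrative, not taken from the authors' code.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def split_clean_noisy(per_sample_losses, p_threshold=0.5):
    """Fit a 2-component GMM to per-sample losses and return a boolean mask
    of samples whose posterior probability of the low-loss (presumed clean)
    component exceeds p_threshold, along with that probability."""
    losses = np.asarray(per_sample_losses, dtype=np.float64).reshape(-1, 1)
    # Min-max normalize so the fit does not depend on the loss scale.
    losses = (losses - losses.min()) / (losses.max() - losses.min() + 1e-8)
    gmm = GaussianMixture(n_components=2, max_iter=100, reg_covar=5e-4)
    gmm.fit(losses)
    clean_comp = int(np.argmin(gmm.means_))      # component with the smaller mean loss
    p_clean = gmm.predict_proba(losses)[:, clean_comp]
    return p_clean > p_threshold, p_clean
```

In the full method, the probably-clean split keeps its labels while the probably-noisy split is treated as unlabeled data for MixMatch-style semi-supervised training, with two networks dividing the data for each other.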
Symmetric cross entropy for robust learning with noisy labels
Training accurate deep neural networks (DNNs) in the presence of noisy labels is an
important and challenging task. Though a number of approaches have been proposed for …
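The symmetric cross entropy combines the usual cross entropy with a reverse cross entropy term in which the undefined log 0 on the one-hot label is clipped to a constant A. A minimal NumPy sketch, with alpha, beta, and A as illustrative defaults rather than prescribed settings:

```python
import numpy as np

def symmetric_cross_entropy(probs, labels, alpha=0.1, beta=1.0, A=-4.0):
    """probs: (n, K) predicted class probabilities; labels: (n,) integer labels.
    Returns mean(alpha * CE + beta * RCE), where RCE is the reverse cross
    entropy with log(0) of the one-hot target clipped to the constant A."""
    eps = 1e-7
    p_true = np.clip(probs[np.arange(len(labels)), labels], eps, 1.0)
    ce = -np.log(p_true)        # standard cross entropy
    rce = -A * (1.0 - p_true)   # reverse CE reduces to -A * (1 - p_true) for one-hot targets
    return float(np.mean(alpha * ce + beta * rce))
```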
Confident learning: Estimating uncertainty in dataset labels
Learning exists in the context of data, yet notions of confidence typically focus on model
predictions, not label quality. Confident learning (CL) is an alternative approach which …
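In broad strokes, confident learning compares out-of-sample predicted probabilities against per-class confidence thresholds to estimate which labels are likely wrong. The following is a simplified sketch of that idea, not the paper's full confident-joint construction or its cleanlab implementation:

```python
import numpy as np

def flag_label_issues(pred_probs, noisy_labels):
    """pred_probs: (n, K) out-of-sample predicted probabilities;
    noisy_labels: (n,) observed, possibly incorrect integer labels.
    An example is flagged when some other class clears its class threshold
    (the mean predicted probability of that class among examples labeled
    with it) and is more probable than the observed label."""
    n, K = pred_probs.shape
    thresholds = np.array([
        pred_probs[noisy_labels == j, j].mean() if np.any(noisy_labels == j) else 1.0
        for j in range(K)
    ])
    issues = np.zeros(n, dtype=bool)
    for i in range(n):
        above = pred_probs[i] >= thresholds      # classes clearing their own threshold
        above[noisy_labels[i]] = False           # never flag based on the observed label
        if above.any():
            cand = np.argmax(np.where(above, pred_probs[i], -np.inf))
            issues[i] = pred_probs[i, cand] > pred_probs[i, noisy_labels[i]]
    return issues
```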
Normalized loss functions for deep learning with noisy labels
Robust loss functions are essential for training accurate deep neural networks (DNNs) in the
presence of noisy (incorrect) labels. It has been shown that the commonly used Cross …
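The normalization proposed in that work divides a loss by its sum over all possible target classes, which bounds the loss and is the paper's route to making otherwise non-robust losses robust; the paper then pairs such normalized "active" losses with a "passive" loss such as MAE. A sketch of just the normalized cross entropy step:

```python
import numpy as np

def normalized_cross_entropy(probs, labels):
    """Normalized CE: the cross entropy w.r.t. the observed label divided by
    the sum of cross entropies over every possible class label. Only the
    normalization step is shown; the paper combines it with a passive loss."""
    eps = 1e-7
    log_p = np.log(np.clip(probs, eps, 1.0))            # (n, K)
    ce_given = -log_p[np.arange(len(labels)), labels]   # CE for the observed label
    ce_all = -log_p.sum(axis=1)                         # summed CE over all K classes
    return float(np.mean(ce_given / ce_all))
```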
Meta-weight-net: Learning an explicit mapping for sample weighting
Current deep neural networks (DNNs) can easily overfit to biased training data with
corrupted labels or class imbalance. Sample re-weighting strategy is commonly used to …
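Meta-Weight-Net makes the re-weighting function explicit: a small MLP maps each sample's loss to a weight in (0, 1), and the MLP itself is trained by meta-optimization on a small clean validation set. A PyTorch sketch of only the weighting network follows; the bi-level meta-update is omitted, and the hidden size is an assumption consistent with the paper's single-hidden-layer description.

```python
import torch
import torch.nn as nn

class WeightNet(nn.Module):
    """Maps a per-sample loss value to a sample weight in (0, 1)."""
    def __init__(self, hidden=100):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, per_sample_loss):
        # per_sample_loss: shape (batch,); returns weights of shape (batch,)
        return self.net(per_sample_loss.unsqueeze(1)).squeeze(1)

# Sketch of use in a training step: weight each sample's loss before averaging.
# losses = nn.functional.cross_entropy(logits, noisy_targets, reduction="none")
# weighted_loss = (weight_net(losses) * losses).mean()
```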
Unsupervised label noise modeling and loss correction
Despite being robust to small amounts of label noise, convolutional neural networks trained
with stochastic gradient methods have been shown to easily fit random labels. When there …