A survey of mix-based data augmentation: Taxonomy, methods, applications, and explainability
Data augmentation (DA) is indispensable in modern machine learning and deep neural
networks. The basic idea of DA is to construct new training data to improve the model's …
Survey: Image mixing and deleting for data augmentation
Neural networks are prone to overfitting and memorizing data patterns. To avoid overfitting
and enhance their generalization and performance, various methods have been suggested …
OpenOOD v1.5: Enhanced benchmark for out-of-distribution detection
Out-of-Distribution (OOD) detection is critical for the reliable operation of open-world
intelligent systems. Despite the emergence of an increasing number of OOD detection …
A whac-a-mole dilemma: Shortcuts come in multiples where mitigating one amplifies others
Abstract Machine learning models have been found to learn shortcuts---unintended decision
rules that are unable to generalize---undermining models' reliability. Previous works address …
OpenMix: Exploring outlier samples for misclassification detection
Reliable confidence estimation for deep neural classifiers is a challenging yet fundamental
requirement in high-stakes applications. Unfortunately, modern deep neural networks are …
Cal-DETR: Calibrated detection transformer
Albeit revealing impressive predictive performance for several computer vision tasks, deep
neural networks (DNNs) are prone to making overconfident predictions. This limits the …
RankMixup: Ranking-based mixup training for network calibration
Network calibration aims to accurately estimate confidence levels, which is particularly
important for deploying deep neural networks in real-world systems. Recent approaches …
Sample-dependent adaptive temperature scaling for improved calibration
It is now well known that neural networks can be wrong with high confidence in their
predictions, leading to poor calibration. The most common post-hoc approach to …
Latent discriminant deterministic uncertainty
Predictive uncertainty estimation is essential for deploying Deep Neural Networks in
real-world autonomous systems. However, most successful approaches are computationally …
Graph invariant learning with subgraph co-mixup for out-of-distribution generalization
Graph neural networks (GNNs) have been demonstrated to perform well in graph
representation learning, but often lack generalization capability when tackling out-of …