A survey of mix-based data augmentation: Taxonomy, methods, applications, and explainability

C Cao, F Zhou, Y Dai, J Wang, K Zhang - ACM Computing Surveys, 2022 - dl.acm.org
Data augmentation (DA) is indispensable in modern machine learning and deep neural
networks. The basic idea of DA is to construct new training data to improve the model's …
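The canonical member of the mix-based family such a survey covers is mixup, which blends pairs of inputs and their labels. A minimal sketch, assuming NumPy arrays, one-hot labels, and the commonly used Beta(alpha, alpha) mixing coefficient; the function name and defaults are illustrative, not code from the survey:

import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2):
    """Convexly combine two training examples and their one-hot labels."""
    lam = np.random.beta(alpha, alpha)       # mixing weight in [0, 1]
    x_mixed = lam * x1 + (1.0 - lam) * x2    # blend inputs element-wise
    y_mixed = lam * y1 + (1.0 - lam) * y2    # blend soft labels the same way
    return x_mixed, y_mixed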

Survey: Image mixing and deleting for data augmentation

H Naveed, S Anwar, M Hayat, K Javed… - Engineering Applications of …, 2024 - Elsevier
Neural networks are prone to overfitting and memorizing data patterns. To avoid overfitting
and enhance their generalization and performance, various methods have been suggested …
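Among the deleting-style augmentations surveyed here, a Cutout-like patch erasure is the simplest representative. The sketch below is a hedged illustration (the patch size and zero fill are assumptions, not any paper's exact recipe):

import numpy as np

def cutout(image, patch_size=16):
    """Erase a random square patch from an H x W x C image (NumPy array)."""
    h, w = image.shape[:2]
    cy, cx = np.random.randint(h), np.random.randint(w)    # patch centre
    y1, y2 = max(0, cy - patch_size // 2), min(h, cy + patch_size // 2)
    x1, x2 = max(0, cx - patch_size // 2), min(w, cx + patch_size // 2)
    out = image.copy()
    out[y1:y2, x1:x2] = 0                                   # delete the region
    return out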

OpenOOD v1.5: Enhanced benchmark for out-of-distribution detection

J Zhang, J Yang, P Wang, H Wang, Y Lin… - arXiv preprint arXiv …, 2023 - arxiv.org
Out-of-Distribution (OOD) detection is critical for the reliable operation of open-world
intelligent systems. Despite the emergence of an increasing number of OOD detection …
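A standard baseline that OOD benchmarks of this kind compare against is the maximum softmax probability (MSP) score; the sketch below is illustrative, not OpenOOD's own code, and the threshold shown is a hypothetical value tuned on validation data in practice:

import numpy as np

def msp_score(logits):
    """Maximum softmax probability; lower scores suggest out-of-distribution inputs."""
    z = logits - logits.max(axis=-1, keepdims=True)          # numerical stability
    probs = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return probs.max(axis=-1)

# is_ood = msp_score(logits) < 0.7   # 0.7 is a hypothetical threshold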

A whac-a-mole dilemma: Shortcuts come in multiples where mitigating one amplifies others

Z Li, I Evtimov, A Gordo, C Hazirbas… - Proceedings of the …, 2023 - openaccess.thecvf.com
Machine learning models have been found to learn shortcuts---unintended decision
rules that are unable to generalize---undermining models' reliability. Previous works address …

OpenMix: Exploring outlier samples for misclassification detection

F Zhu, Z Cheng, XY Zhang… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Reliable confidence estimation for deep neural classifiers is a challenging yet fundamental
requirement in high-stakes applications. Unfortunately, modern deep neural networks are …

Cal-DETR: Calibrated detection transformer

MA Munir, SH Khan, MH Khan, M Ali… - Advances in neural …, 2024 - proceedings.neurips.cc
Despite impressive predictive performance on several computer vision tasks, deep
neural networks (DNNs) are prone to making overconfident predictions. This limits the …

RankMixup: Ranking-based mixup training for network calibration

J Noh, H Park, J Lee, B Ham - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
Network calibration aims to accurately estimate prediction confidence, which is particularly
important for employing deep neural networks in real-world systems. Recent approaches …
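Calibration quality in this line of work is typically reported with the expected calibration error (ECE). The sketch below shows the standard binned estimator as a reference point only; it is not the paper's ranking-based training objective, and the bin count is the commonly used default:

import numpy as np

def expected_calibration_error(confidences, correct, n_bins=15):
    """Weighted gap between mean confidence and accuracy over confidence bins.

    confidences: per-sample max predicted probability; correct: boolean array.
    """
    bin_edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            gap = abs(correct[in_bin].mean() - confidences[in_bin].mean())
            ece += in_bin.mean() * gap                       # weight by bin size
    return ece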

Sample-dependent adaptive temperature scaling for improved calibration

T Joy, F Pinto, SN Lim, PHS Torr… - Proceedings of the AAAI …, 2023 - ojs.aaai.org
It is now well known that neural networks can be wrong with high confidence in their
predictions, leading to poor calibration. The most common post-hoc approach to …
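The classic post-hoc recipe this paper builds on is global temperature scaling: a single scalar T is fitted on held-out logits by minimizing the negative log-likelihood. A minimal sketch of that baseline (the paper itself makes the temperature sample-dependent; the optimizer, bounds, and function names here are assumptions):

import numpy as np
from scipy.optimize import minimize_scalar

def nll(T, logits, labels):
    """Negative log-likelihood of temperature-scaled softmax predictions."""
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)                     # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def fit_temperature(logits, labels):
    """Fit one scalar temperature on a held-out validation set."""
    res = minimize_scalar(nll, bounds=(0.05, 10.0),
                          args=(logits, labels), method="bounded")
    return res.x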

Latent discriminant deterministic uncertainty

G Franchi, X Yu, A Bursuc, E Aldea… - … on Computer Vision, 2022 - Springer
Predictive uncertainty estimation is essential for deploying Deep Neural Networks in real-
world autonomous systems. However, most successful approaches are computationally …

Graph invariant learning with subgraph co-mixup for out-of-distribution generalization

T Jia, H Li, C Yang, T Tao, C Shi - … of the AAAI Conference on Artificial …, 2024 - ojs.aaai.org
Graph neural networks (GNNs) have been demonstrated to perform well in graph
representation learning, but often lack generalization capability when tackling out-of-distribution …