Are all losses created equal: A neural collapse perspective

J Zhou, C You, X Li, K Liu, S Liu… - Advances in Neural …, 2022 - proceedings.neurips.cc
While cross entropy (CE) is the most commonly used loss function to train deep neural
networks for classification tasks, many alternative losses have been developed to obtain …

Why do better loss functions lead to less transferable features?

S Kornblith, T Chen, H Lee… - Advances in Neural …, 2021 - proceedings.neurips.cc
Previous work has proposed many new loss functions and regularizers that improve test
accuracy on image classification tasks. However, it is not clear whether these loss functions …

Free lunch for domain adversarial training: Environment label smoothing

YF Zhang, X Wang, J Liang, Z Zhang, L Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
A fundamental challenge for machine learning models is how to generalize learned models
for out-of-distribution (OOD) data. Among various approaches, exploiting invariant features …

Mapping retrogressive thaw slumps using deep neural networks

Y Yang, BM Rogers, G Fiske, J Watts, S Potter… - Remote Sensing of …, 2023 - Elsevier
Retrogressive thaw slumps (RTS) are thermokarst features in ice-rich hillslope permafrost
terrain, and their occurrence in the warming Arctic is increasingly frequent and has caused …

LE‐YOLOv5: A Lightweight and Efficient Road Damage Detection Algorithm Based on Improved YOLOv5

Z Diao, X Huang, H Liu, Z Liu - International Journal of …, 2023 - Wiley Online Library
Road damage detection is very important for road safety and timely repair. The previous
detection methods mainly rely on humans or large machines, which are costly and …

Understanding generalized label smoothing when learning with noisy labels

J Wei, H Liu, T Liu, G Niu, Y Liu - 2021 - openreview.net
Label smoothing (LS) is an emerging learning paradigm that uses the positively weighted
average of both the hard training labels and uniformly distributed soft labels. It was shown …
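The smoothing operation described in this entry, a positively weighted average of the one-hot hard labels and a uniform distribution, can be sketched as follows. The function name and the smoothing weight α = 0.1 are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def smooth_labels(hard_labels, num_classes, alpha=0.1):
    """Return (1 - alpha) * one_hot + alpha * uniform for each label.

    hard_labels: integer class indices, shape (N,)
    alpha: smoothing weight (assumed default, commonly 0.1)
    """
    one_hot = np.eye(num_classes)[hard_labels]
    uniform = np.full(num_classes, 1.0 / num_classes)
    return (1.0 - alpha) * one_hot + alpha * uniform

# Example: two samples with true classes 0 and 2, over 3 classes.
smoothed = smooth_labels(np.array([0, 2]), num_classes=3, alpha=0.1)
# Each row still sums to 1; the true class gets 1 - alpha + alpha/K.
```

The smoothed targets are then used in place of the one-hot targets inside the cross-entropy loss; with alpha = 0 this reduces exactly to standard hard-label training.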

Data Optimization in Deep Learning: A Survey

O Wu, R Yao - arXiv preprint arXiv:2310.16499, 2023 - arxiv.org
Large-scale, high-quality data are considered an essential factor for the successful
application of many deep learning techniques. Meanwhile, numerous real-world deep …

Towards test time domain adaptation via Negative Label Smoothing

H Yang, H Zuo, R Zhou, M Wang, Y Zhou - Neurocomputing, 2024 - Elsevier
Label Smoothing (LS) is a widely used training technique that adjusts hard labels towards a
softer distribution, which prevents the model from becoming over-confident and enhances model …

Deep label embedding learning for classification

P Nousi, A Tefas - Applied Soft Computing, 2024 - Elsevier
One-hot 0/1 encoding is the most popular method of encoding
class labels for classification tasks. Despite its simplicity and popularity, it comes with …

Label Smoothing for Enhanced Text Sentiment Classification

Y Gao, S Si - arXiv preprint arXiv:2312.06522, 2023 - arxiv.org
Label smoothing is a widely used technique in various domains, such as image
classification and speech recognition, known for effectively combating model overfitting …