Are all losses created equal: A neural collapse perspective
While cross entropy (CE) is the most commonly used loss function to train deep neural
networks for classification tasks, many alternative losses have been developed to obtain …
Why do better loss functions lead to less transferable features?
Previous work has proposed many new loss functions and regularizers that improve test
accuracy on image classification tasks. However, it is not clear whether these loss functions …
Free lunch for domain adversarial training: Environment label smoothing
A fundamental challenge for machine learning models is generalizing to out-of-distribution (OOD) data. Among various approaches, exploiting invariant features …
Mapping retrogressive thaw slumps using deep neural networks
Retrogressive thaw slumps (RTS) are thermokarst features in ice-rich hillslope permafrost
terrain, and their occurrence in the warming Arctic is increasingly frequent and has caused …
LE‐YOLOv5: A Lightweight and Efficient Road Damage Detection Algorithm Based on Improved YOLOv5
Z Diao, X Huang, H Liu, Z Liu - International Journal of …, 2023 - Wiley Online Library
Road damage detection is very important for road safety and timely repair. The previous
detection methods mainly rely on humans or large machines, which are costly and …
Understanding generalized label smoothing when learning with noisy labels
Label smoothing (LS) is an emerging learning paradigm that uses the positively weighted
average of both the hard training labels and uniformly distributed soft labels. It was shown …
Data Optimization in Deep Learning: A Survey
O Wu, R Yao - arXiv preprint arXiv:2310.16499, 2023 - arxiv.org
Large-scale, high-quality data are considered an essential factor for the successful
application of many deep learning techniques. Meanwhile, numerous real-world deep …
Towards test time domain adaptation via Negative Label Smoothing
Label Smoothing (LS) is a widely-used training technique that adjusts hard labels towards a
softer distribution, which prevents the model from becoming over-confident and enhances model …
Deep label embedding learning for classification
The one-hot 0/1 encoding is the most popular method of encoding class labels for classification tasks. Despite its simplicity and popularity, it comes with …
Label Smoothing for Enhanced Text Sentiment Classification
Y Gao, S Si - arXiv preprint arXiv:2312.06522, 2023 - arxiv.org
Label smoothing is a widely used technique in various domains, such as image
classification and speech recognition, known for effectively combating model overfitting …