A review of uncertainty quantification in deep learning: Techniques, applications and challenges
Uncertainty quantification (UQ) methods play a pivotal role in reducing the impact of
uncertainties during both optimization and decision making processes. They have been …
In defense of pseudo-labeling: An uncertainty-aware pseudo-label selection framework for semi-supervised learning
The recent research in semi-supervised learning (SSL) is mostly dominated by consistency
regularization based methods which achieve strong performance. However, they heavily …
Want to reduce labeling cost? GPT-3 can help
Data annotation is a time-consuming and labor-intensive process for many NLP tasks.
Although there exist various methods to produce pseudo data labels, they are often task …
Self-training: A survey
Self-training methods have gained significant attention in recent years due to their
effectiveness in leveraging small labeled datasets and large unlabeled observations for …
Uncertainty in natural language processing: Sources, quantification, and applications
As a main field of artificial intelligence, natural language processing (NLP) has achieved
remarkable success via deep neural networks. Plenty of NLP tasks have been addressed in …
Cycle self-training for domain adaptation
Mainstream approaches for unsupervised domain adaptation (UDA) learn domain-invariant
representations to narrow the domain shift, which are empirically effective but theoretically …
Exploiting completeness and uncertainty of pseudo labels for weakly supervised video anomaly detection
Weakly supervised video anomaly detection aims to identify abnormal events in videos
using only video-level labels. Recently, two-stage self-training methods have achieved …
Meta-based self-training and re-weighting for aspect-based sentiment analysis
Aspect-based sentiment analysis (ABSA) aims to identify fine-grained aspects, opinions,
and sentiment polarities. Recent ABSA research focuses on utilizing multi-task learning …
Few-shot named entity recognition: An empirical baseline study
This paper presents an empirical study to efficiently build named entity recognition (NER)
systems when a small amount of in-domain labeled data is available. Based upon recent …
Fine-tuning pre-trained language model with weak supervision: A contrastive-regularized self-training approach
Fine-tuned pre-trained language models (LMs) have achieved enormous success in many
natural language processing (NLP) tasks, but they still require excessive labeled data in the …