Deep Learning applications for COVID-19
This survey explores how Deep Learning has battled the COVID-19 pandemic and provides
directions for future research on COVID-19. We cover Deep Learning applications in Natural …
Don't stop pretraining: Adapt language models to domains and tasks
Language models pretrained on text from a wide variety of sources form the foundation of
today's NLP. In light of the success of these broad-coverage models, we investigate whether …
Stereotyping Norwegian salmon: An inventory of pitfalls in fairness benchmark datasets
Auditing NLP systems for computational harms like surfacing stereotypes is an elusive goal.
Several recent efforts have focused on benchmark datasets consisting of pairs of contrastive …
Beyond semantic distance: Automated scoring of divergent thinking greatly improves with large language models
Automated scoring for divergent thinking (DT) seeks to overcome a key obstacle to creativity
measurement: the effort, cost, and reliability of scoring open-ended tests. For a common test …
ARBERT & MARBERT: Deep bidirectional transformers for Arabic
M Abdul-Mageed, AR Elmadany… - arXiv preprint arXiv …, 2020 - arxiv.org
Pre-trained language models (LMs) are currently integral to many natural language
processing systems. Although multilingual LMs were also introduced to serve many …
Ensemble distillation for robust model fusion in federated learning
Federated Learning (FL) is a machine learning setting where many devices collaboratively
train a machine learning model while keeping the training data decentralized. In most of the …
IndicNLPSuite: Monolingual corpora, evaluation benchmarks and pre-trained multilingual language models for Indian languages
D Kakwani, A Kunchukuttan, S Golla… - Findings of the …, 2020 - aclanthology.org
In this paper, we introduce NLP resources for 11 major Indian languages from two major
language families. These resources include: (a) large-scale sentence-level monolingual …
Elevater: A benchmark and toolkit for evaluating language-augmented visual models
Learning visual representations from natural language supervision has recently shown great
promise in a number of pioneering works. In general, these language-augmented visual …
Pre-trained models for natural language processing: A survey
Recently, the emergence of pre-trained models (PTMs) has brought natural language
processing (NLP) to a new era. In this survey, we provide a comprehensive review of PTMs …
Language models are few-shot learners
We demonstrate that scaling up language models greatly improves task-agnostic, few-shot
performance, sometimes even becoming competitive with prior state-of-the-art fine-tuning …