Named entity extraction for knowledge graphs: A literature overview
An enormous amount of digital information is expressed as natural-language (NL) text that is
not easily processable by computers. Knowledge Graphs (KG) offer a widely used format for …
Choosing transfer languages for cross-lingual learning
Cross-lingual transfer, where a high-resource transfer language is used to improve the
accuracy of a low-resource task language, is now an invaluable tool for improving …
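The transfer-language selection idea can be sketched as a feature-based ranker. The two features below (lexical overlap with the task language and transfer-corpus size) are among those commonly used for this problem, but the weights and all data in this sketch are illustrative assumptions, not the learned ranker from the paper:

```python
def rank_transfer_languages(task_vocab, candidates):
    # candidates: {language: {"vocab": set of words, "size": corpus size}}.
    # Toy linear scorer over two features: lexical overlap with the
    # task language and transfer-corpus size (capped at 1M tokens).
    # The 0.7 / 0.3 weights are illustrative, not learned.
    def score(c):
        overlap = len(task_vocab & c["vocab"]) / max(len(task_vocab), 1)
        size = min(c["size"] / 1_000_000, 1.0)
        return 0.7 * overlap + 0.3 * size
    return sorted(candidates, key=lambda lang: score(candidates[lang]), reverse=True)

# Hypothetical task/candidate data for illustration only.
task_vocab = {"agua", "sol", "casa"}
candidates = {
    "spanish": {"vocab": {"agua", "sol", "casa", "perro"}, "size": 1_000_000},
    "english": {"vocab": {"water", "sun", "house"}, "size": 2_000_000},
}
print(rank_transfer_languages(task_vocab, candidates))  # spanish ranked first
```

A learned ranker would replace the hand-set weights with a model trained on observed transfer results across many language pairs.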
Neural entity linking: A survey of models based on deep learning
This survey presents a comprehensive description of recent neural entity linking (EL)
systems developed since 2015 as a result of the “deep learning revolution” in natural …
Entity linking in 100 languages
We propose a new formulation for multilingual entity linking, where language-specific
mentions resolve to a language-agnostic Knowledge Base. We train a dual encoder in this …
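The dual-encoder retrieval setup can be illustrated with a toy sketch: one encoder maps both mentions and KB entries into a shared vector space, and linking becomes nearest-neighbor search. Here a bag of character n-grams stands in for the trained neural encoder, and the KB ids and descriptions are invented for illustration:

```python
from collections import Counter
import math

def encode(text, n=3):
    # Toy stand-in for a neural encoder: bag of character n-grams.
    # Mentions and KB entries share this one encoder, as in a dual encoder.
    t = f" {text.lower()} "
    return Counter(t[i:i + n] for i in range(len(t) - n + 1))

def score(a, b):
    # Cosine similarity in the shared representation space.
    dot = sum(v * b.get(k, 0) for k, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def link(mention, kb):
    # Resolve a mention to the best-scoring language-agnostic KB entry.
    m = encode(mention)
    return max(kb, key=lambda e: score(m, encode(kb[e])))

# Hypothetical miniature KB for illustration only.
kb = {
    "Q90": "Paris capital of France",
    "Q79": "Egypt country in Africa",
}
print(link("Paris", kb))  # prints "Q90"
```

In the full multilingual setting, the same scoring works regardless of the mention's language because the encoder is trained so that mentions in any language land near their KB entry.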
What can knowledge bring to machine learning?—a survey of low-shot learning for structured data
Supervised machine learning has several drawbacks that make it difficult to use in many
situations. Drawbacks include heavy reliance on massive training data, limited …
Re-examining the Role of Schema Linking in Text-to-SQL
In existing sophisticated text-to-SQL models, schema linking is often considered a simple,
minor component, belying its importance. By providing a schema linking corpus based on …
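As a toy illustration of what schema linking does, the sketch below tags question tokens that exactly match a table or column name; real models use learned, fuzzy matching rather than exact string equality. The example schema is invented:

```python
def schema_link(question, schema):
    # schema: {table_name: [column_names]}.
    # Exact-match linking: tag each question token that names a schema item.
    items = {t.lower(): ("table", t) for t in schema}
    items.update(
        {c.lower(): ("column", c) for cols in schema.values() for c in cols}
    )
    return [(tok, items[tok]) for tok in question.lower().split() if tok in items]

# Hypothetical schema for illustration only.
schema = {"singer": ["name", "age", "country"]}
links = schema_link("what is the age of each singer", schema)
print(links)  # [('age', ('column', 'age')), ('singer', ('table', 'singer'))]
```

These links tell the downstream SQL generator which schema elements the question refers to, which is why errors at this step propagate into the generated query.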
When being unseen from mBERT is just the beginning: Handling new languages with multilingual language models
Transfer learning based on pretraining language models on a large amount of raw data has
become a new norm to reach state-of-the-art performance in NLP. Still, it remains unclear …
How linguistically fair are multilingual pre-trained language models?
M. Choudhury and A. Deshpande, Proceedings of the AAAI Conference on …, 2021.
Massively multilingual pre-trained language models, such as mBERT and XLM-RoBERTa,
have received significant attention in the recent NLP literature for their excellent capability …
Cross-lingual few-shot learning on unseen languages
Large pre-trained language models (LMs) have demonstrated the ability to obtain good
performance on downstream tasks with limited examples in cross-lingual settings. However …
Logic-guided semantic representation learning for zero-shot relation classification
Relation classification aims to extract semantic relations between entity pairs from
sentences. However, most existing methods can only identify seen relation classes that …
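The zero-shot idea behind this line of work can be sketched as follows: each relation is represented by a natural-language description, so a new, unseen relation needs only a description, not labeled examples. Jaccard word overlap below is a toy stand-in for learned semantic similarity, and the relations and sentence are invented:

```python
def classify_relation(sentence, relation_descriptions):
    # relation_descriptions: {relation_label: textual description}.
    # Classify by similarity between the sentence and each description;
    # unseen relations are handled by simply adding a new description.
    s = set(sentence.lower().split())
    def sim(desc):
        d = set(desc.lower().split())
        return len(s & d) / len(s | d)
    return max(relation_descriptions, key=lambda r: sim(relation_descriptions[r]))

# Hypothetical relation inventory for illustration only.
rels = {
    "born_in": "person was born in a place",
    "works_for": "person is employed by an organization",
}
print(classify_relation("Marie Curie was born in Warsaw", rels))  # prints "born_in"
```

A trained system replaces the word-overlap measure with embeddings so that paraphrases of a relation also score highly, which is what makes genuinely unseen classes recognizable.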