Breaking through the 80% glass ceiling: Raising the state of the art in word sense disambiguation by incorporating knowledge graph information

M Bevilacqua, R Navigli - Proceedings of the conference …, 2020 - iris.uniroma1.it
Neural architectures are the current state of the art in Word Sense Disambiguation (WSD).
However, they make limited use of the vast amount of relational information encoded in …

FlauBERT: Unsupervised language model pre-training for French

H Le, L Vial, J Frej, V Segonne, M Coavoux… - arXiv preprint arXiv …, 2019 - arxiv.org
Language models have become a key step to achieve state-of-the-art results in many
different Natural Language Processing (NLP) tasks. Leveraging the huge amount of …

Recent trends in word sense disambiguation: A survey

M Bevilacqua, T Pasini… - … Joint Conference on …, 2021 - researchportal.helsinki.fi
Word Sense Disambiguation (WSD) aims at making explicit the semantics of a word
in context by identifying the most suitable meaning from a predefined sense inventory …

GlossBERT: BERT for word sense disambiguation with gloss knowledge

L Huang, C Sun, X Qiu, X Huang - arXiv preprint arXiv:1908.07245, 2019 - arxiv.org
Word Sense Disambiguation (WSD) aims to find the exact sense of an ambiguous word in a
particular context. Traditional supervised methods rarely take into consideration the lexical …

A survey on semantic processing techniques

R Mao, K He, X Zhang, G Chen, J Ni, Z Yang… - Information …, 2024 - Elsevier
Semantic processing is a fundamental research domain in computational linguistics. In the
era of powerful pre-trained language models and large language models, the advancement …

Does BERT make any sense? Interpretable word sense disambiguation with contextualized embeddings

G Wiedemann, S Remus, A Chawla… - arXiv preprint arXiv …, 2019 - arxiv.org
Contextualized word embeddings (CWE) such as provided by ELMo (Peters et al., 2018),
Flair NLP (Akbik et al., 2018), or BERT (Devlin et al., 2019) are a major recent innovation in …

Moving down the long tail of word sense disambiguation with gloss-informed biencoders

T Blevins, L Zettlemoyer - arXiv preprint arXiv:2005.02590, 2020 - arxiv.org
A major obstacle in Word Sense Disambiguation (WSD) is that word senses are not
uniformly distributed, causing existing models to generally perform poorly on senses that are …

With more contexts comes better performance: Contextualized sense embeddings for all-round word sense disambiguation

B Scarlini, T Pasini, R Navigli - Proceedings of the 2020 …, 2020 - iris.uniroma1.it
Contextualized word embeddings have been employed effectively across several tasks in
Natural Language Processing, as they have proved to carry useful semantic information …

ConSeC: Word sense disambiguation as continuous sense comprehension

E Barba, L Procopio, R Navigli - Proceedings of the 2021 …, 2021 - aclanthology.org
Supervised systems have nowadays become the standard recipe for Word Sense
Disambiguation (WSD), with Transformer-based language models as their primary …

SensEmBERT: Context-enhanced sense embeddings for multilingual word sense disambiguation

B Scarlini, T Pasini, R Navigli - Proceedings of the AAAI conference on …, 2020 - ojs.aaai.org
Contextual representations of words derived by neural language models have proven to
effectively encode the subtle distinctions that might occur between different meanings of the …