From word to sense embeddings: A survey on vector representations of meaning
J Camacho-Collados, MT Pilehvar - Journal of Artificial Intelligence …, 2018 - jair.org
Over the past years, distributed semantic representations have proved to be effective and
flexible keepers of prior knowledge to be integrated into downstream applications. This …
Polysemy—Evidence from Linguistics, Behavioral Science, and Contextualized Language Models
Polysemy is the type of lexical ambiguity where a word has multiple distinct but related
interpretations. In the past decade, it has been the subject of a great many studies across …
Grounding action descriptions in videos
M Regneri, M Rohrbach, D Wetzel, S Thater… - Transactions of the …, 2013 - direct.mit.edu
Recent work has shown that the integration of visual information into text-based models can
substantially improve model predictions, but so far only visual information extracted from …
Let's Play Mono-Poly: BERT Can Reveal Words' Polysemy Level and Partitionability into Senses
A Garí Soler, M Apidianaki - Transactions of the Association for …, 2021 - direct.mit.edu
Pre-trained language models (LMs) encode rich information about linguistic structure but
their knowledge about lexical polysemy remains unclear. We propose a novel experimental …
From word types to tokens and back: A survey of approaches to word meaning representation and interpretation
M Apidianaki - Computational Linguistics, 2023 - direct.mit.edu
Vector-based word representation paradigms situate lexical meaning at different levels of
abstraction. Distributional and static embedding models generate a single vector per word …
Diachronic usage relatedness (DURel): A framework for the annotation of lexical semantic change
D Schlechtweg, SS Walde, S Eckmann - arXiv preprint arXiv:1804.06517, 2018 - arxiv.org
We propose a framework that extends synchronic polysemy annotation to diachronic
changes in lexical meaning, to counteract the lack of resources for evaluating computational …
DWUG: A large resource of diachronic word usage graphs in four languages
Word meaning is notoriously difficult to capture, both synchronically and diachronically. In
this paper, we describe the creation of the largest resource of graded contextualized …
SemEval-2013 task 13: Word sense induction for graded and non-graded senses
D Jurgens, I Klapaftis - Second Joint Conference on Lexical and …, 2013 - aclanthology.org
Most work on word sense disambiguation has assumed that word usages are best labeled
with a single sense. However, contextual ambiguity or fine-grained senses can potentially …
An automatic approach to identify word sense changes in text media across timescales
In this paper, we propose an unsupervised and automated method to identify noun sense
changes based on rigorous analysis of time-varying text data available in the form of millions …
Measuring word meaning in context
K Erk, D McCarthy, N Gaylord - Computational Linguistics, 2013 - direct.mit.edu
Word sense disambiguation (WSD) is an old and important task in computational linguistics
that still remains challenging, to machines as well as to human annotators. Recently there …