Learning gender-neutral word embeddings
Word embedding models have become a fundamental component in a wide range of
Natural Language Processing (NLP) applications. However, embeddings trained on human …
From word to sense embeddings: A survey on vector representations of meaning
J Camacho-Collados, MT Pilehvar - Journal of Artificial Intelligence …, 2018 - jair.org
Over the past years, distributed semantic representations have proved to be effective and
flexible keepers of prior knowledge to be integrated into downstream applications. This …
AspEm: Embedding Learning by Aspects in Heterogeneous Information Networks
Heterogeneous information networks (HINs) are ubiquitous in real-world applications. Due
to the heterogeneity in HINs, the typed edges may not fully align with each other. In order to …
From word types to tokens and back: A survey of approaches to word meaning representation and interpretation
M Apidianaki - Computational Linguistics, 2023 - direct.mit.edu
Vector-based word representation paradigms situate lexical meaning at different levels of
abstraction. Distributional and static embedding models generate a single vector per word …
Hyperlex: A large-scale evaluation of graded lexical entailment
We introduce HyperLex—a data set and evaluation resource that quantifies the extent of
semantic category membership, that is, the type-of relation, also known as hyponymy …
Embedding words and senses together via joint knowledge-enhanced training
Word embeddings are widely used in Natural Language Processing, mainly due to their
success in capturing semantic information from massive corpora. However, their creation …
Semantic parsing with semi-supervised sequential autoencoders
We present a novel semi-supervised approach for sequence transduction and apply it to
semantic parsing. The unsupervised component is based on a generative model in which …
Handling homographs in neural machine translation
Homographs, words with different meanings but the same surface form, have long caused
difficulty for machine translation systems, as it is difficult to select the correct translation …
Weakly supervised text classification using supervision signals from a language model
Solving text classification in a weakly supervised manner is important for real-world
applications where human annotations are scarce. In this paper, we propose to query a …
Towards a seamless integration of word senses into downstream NLP applications
Lexical ambiguity can impede NLP systems from accurate understanding of semantics.
Despite its potential benefits, the integration of sense-level information into NLP systems has …