Word Meaning Representation in Neural Language Models: Lexical Polysemy and Semantic Relationships
AG Soler - 2021 - theses.hal.science
Word embedding representations generated by neural language models encode rich
information about language and the world. In this thesis, we investigate the knowledge …
Word Meaning Representation and Interpretation in Vector Space
M Apidianaki - 2022 - hal.science
The analysis and representation of lexical meaning is a central topic in computational
linguistics research, with both theoretical and application-oriented interest. It allows us to study …
Enhancing word representation learning with linguistic knowledge
D Ramírez Echavarría - 2022 - discovery.ucl.ac.uk
Representation learning, the process whereby representations are modelled from data, has
recently become a central part of Natural Language Processing (NLP). Among the most …
Lost in Context? On the Sense-Wise Variance of Contextualized Word Embeddings
Y Wang, Y Zhang - IEEE/ACM Transactions on Audio, Speech …, 2023 - ieeexplore.ieee.org
Contextualized word embeddings in language models have brought considerable advances to NLP.
Intuitively, sentential information is integrated into the representation of words, which can …
Improving lexical embeddings with semantic knowledge
Word embeddings learned on unlabeled data are a popular tool in semantics, but may not
capture the desired semantics. We propose a new learning objective that incorporates both …
Challenges and solutions with alignment and enrichment of word embedding models
Word embedding models offer continuous vector representations that can capture rich
semantics of word co-occurrence patterns. Although these models have improved the state …
Unraveling lexical semantics in the brain: Comparing internal, external, and hybrid language models
To explain how the human brain represents and organizes meaning, many theoretical and
computational language models have been proposed over the years, varying in their …
Indra: A word embedding and semantic relatedness server
In recent years word embedding/distributional semantic models evolved to become a
fundamental component in many natural language processing (NLP) architectures due to …
Lexical semantics enhanced neural word embeddings
D Yang, N Li, L Zou, H Ma - Knowledge-Based Systems, 2022 - Elsevier
Current breakthroughs in natural language processing have benefited dramatically from
neural language models, through which distributional semantics can leverage neural data …
SensEmBERT: Context-enhanced sense embeddings for multilingual word sense disambiguation
Contextual representations of words derived by neural language models have proven to
effectively encode the subtle distinctions that might occur between different meanings of the …