Word Meaning Representation in Neural Language Models: Lexical Polysemy and Semantic Relationships

AG Soler - 2021 - theses.hal.science
Word embedding representations generated by neural language models encode rich
information about language and the world. In this thesis, we investigate the knowledge …

Word Meaning Representation and Interpretation in Vector Space

M Apidianaki - 2022 - hal.science
The analysis and representation of lexical meaning is a central topic in computational
linguistics research, with both theoretical and application-oriented interest. It allows us to study …

Enhancing word representation learning with linguistic knowledge

D Ramírez Echavarría - 2022 - discovery.ucl.ac.uk
Representation learning, the process whereby representations are modelled from data, has
recently become a central part of Natural Language Processing (NLP). Among the most …

Lost in Context? On the Sense-Wise Variance of Contextualized Word Embeddings

Y Wang, Y Zhang - IEEE/ACM Transactions on Audio, Speech …, 2023 - ieeexplore.ieee.org
Contextualized word embeddings in language models have brought significant advances to NLP.
Intuitively, sentential information is integrated into the representation of words, which can …

Improving lexical embeddings with semantic knowledge

M Yu, M Dredze - Proceedings of the 52nd Annual Meeting of the …, 2014 - aclanthology.org
Word embeddings learned on unlabeled data are a popular tool in semantics, but may not
capture the desired semantics. We propose a new learning objective that incorporates both …

Challenges and solutions with alignment and enrichment of word embedding models

CŞ Şahin, RS Caceres, B Oselio… - … Language Processing and …, 2017 - Springer
Word embedding models offer continuous vector representations that can capture rich
semantics of word co-occurrence patterns. Although these models have improved the state …

Unraveling lexical semantics in the brain: Comparing internal, external, and hybrid language models

Y Yang, L Li, S de Deyne, B Li, J Wang… - Human Brain …, 2024 - Wiley Online Library
To explain how the human brain represents and organizes meaning, many theoretical and
computational language models have been proposed over the years, varying in their …

Indra: A word embedding and semantic relatedness server

JE Sales, L Souza, S Barzegar, B Davis… - Proceedings of the …, 2018 - aclanthology.org
In recent years, word embedding/distributional semantic models have evolved to become a
fundamental component in many natural language processing (NLP) architectures due to …

Lexical semantics enhanced neural word embeddings

D Yang, N Li, L Zou, H Ma - Knowledge-Based Systems, 2022 - Elsevier
Current breakthroughs in natural language processing have benefited dramatically from
neural language models, through which distributional semantics can leverage neural data …

SensEmBERT: Context-enhanced sense embeddings for multilingual word sense disambiguation

B Scarlini, T Pasini, R Navigli - Proceedings of the AAAI conference on …, 2020 - ojs.aaai.org
Contextual representations of words derived by neural language models have proven to
effectively encode the subtle distinctions that might occur between different meanings of the …