Unsupervised learning of sentence embeddings using compositional n-gram features
The recent tremendous success of unsupervised word embeddings in a multitude of
applications raises the obvious question of whether similar methods could be derived to improve …
Learning semantic textual similarity from conversations
We present a novel approach to learn representations for sentence-level semantic similarity
using conversational data. Our method trains an unsupervised model to predict …
Unsupervised sentence representations as word information series: Revisiting TF–IDF
Sentence representation at the semantic level is a challenging task for natural language
processing and artificial intelligence. Despite the advances in word embeddings (i.e., word …
An efficient framework for sentence similarity modeling
Sentence similarity modeling lies at the core of many natural language processing
applications, and thus has received much attention. Owing to the success of word …
A systematic study of inner-attention-based sentence representations in multilingual neural machine translation
Neural machine translation has considerably improved the quality of automatic translations
by learning good representations of input sentences. In this article, we explore a multilingual …
Neural bag-of-ngrams
Bag-of-ngrams (BoN) models are commonly used for representing text. One of the
main drawbacks of traditional BoN is that it ignores n-gram semantics. In this paper, we …
A structured distributional model of sentence meaning and processing
Most compositional distributional semantic models represent sentence meaning with a
single vector. In this paper, we propose a structured distributional model (SDM) that …
The mechanism of additive composition
Additive composition (Foltz et al. in Discourse Process 15: 285–307, 1998; Landauer and
Dumais in Psychol Rev 104 (2): 211, 1997; Mitchell and Lapata in Cognit Sci 34 (8): 1388 …
Training encoder model and/or using trained encoder model to determine responsive action(s) for natural language input
Systems, methods, and computer-readable media related to: training an encoder
model that can be utilized to determine semantic similarity of a natural language textual …
Using part of speech tagging for improving Word2vec model
D Suleiman, AA Awajan - 2019 2nd International Conference …, 2019 - ieeexplore.ieee.org
Word2vec is an efficient word embedding model that converts words to vectors by
considering the syntactic and semantic relationships between words. In this paper, an extension of …