Unsupervised learning of sentence embeddings using compositional n-gram features

M Pagliardini, P Gupta, M Jaggi - arXiv preprint arXiv:1703.02507, 2017 - arxiv.org
The recent tremendous success of unsupervised word embeddings in a multitude of
applications raises the obvious question of whether similar methods could be derived to improve …

Learning semantic textual similarity from conversations

Y Yang, S Yuan, D Cer, S Kong, N Constant… - arXiv preprint arXiv …, 2018 - arxiv.org
We present a novel approach to learn representations for sentence-level semantic similarity
using conversational data. Our method trains an unsupervised model to predict …

Unsupervised sentence representations as word information series: Revisiting TF–IDF

I Arroyo-Fernández, CF Méndez-Cruz, G Sierra… - Computer Speech & …, 2019 - Elsevier
Sentence representation at the semantic level is a challenging task for natural language
processing and Artificial Intelligence. Despite the advances in word embeddings (i.e., word …

An efficient framework for sentence similarity modeling

Z Quan, ZJ Wang, Y Le, B Yao, K Li… - IEEE/ACM Transactions …, 2019 - ieeexplore.ieee.org
Sentence similarity modeling lies at the core of many natural language processing
applications, and thus has received much attention. Owing to the success of word …

A systematic study of inner-attention-based sentence representations in multilingual neural machine translation

R Vázquez, A Raganato, M Creutz… - Computational …, 2020 - direct.mit.edu
Neural machine translation has considerably improved the quality of automatic translations
by learning good representations of input sentences. In this article, we explore a multilingual …

Neural bag-of-ngrams

B Li, T Liu, Z Zhao, P Wang, X Du - … of the AAAI Conference on Artificial …, 2017 - ojs.aaai.org
Bag-of-ngrams (BoN) models are commonly used for representing text. One of the
main drawbacks of traditional BoN is that it ignores n-gram semantics. In this paper, we …

A structured distributional model of sentence meaning and processing

E Chersoni, E Santus, L Pannitto, A Lenci… - Natural Language …, 2019 - cambridge.org
Most compositional distributional semantic models represent sentence meaning with a
single vector. In this paper, we propose a structured distributional model (SDM) that …

The mechanism of additive composition

R Tian, N Okazaki, K Inui - Machine Learning, 2017 - Springer
Additive composition (Foltz et al. in Discourse Process 15: 285–307, 1998; Landauer and
Dumais in Psychol Rev 104 (2): 211, 1997; Mitchell and Lapata in Cognit Sci 34 (8): 1388 …

Training encoder model and/or using trained encoder model to determine responsive action(s) for natural language input

B Strope, Y Sung, W Yuan - US Patent 10,783,456, 2020 - Google Patents
Systems, methods, and computer readable media related to: training an encoder
model that can be utilized to determine semantic similarity of a natural language textual …

Using part of speech tagging for improving Word2vec model

D Suleiman, AA Awajan - 2019 2nd International Conference …, 2019 - ieeexplore.ieee.org
Word2vec is an efficient word embedding model that converts words to vectors by
considering syntactic and semantic relationships between words. In this paper, an extension of …