Learning word vectors for 157 languages

E Grave, P Bojanowski, P Gupta, A Joulin… - arXiv preprint arXiv …, 2018 - arxiv.org
Distributed word representations, or word vectors, have recently been applied to many tasks
in natural language processing, leading to state-of-the-art performance. A key ingredient to …
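The vectors described in this paper are typically distributed as plain-text .vec files: a header line with vocabulary size and dimension, then one word and its floats per line. A minimal reading sketch, assuming the cc.<lang>.300.vec naming used on the fastText download pages (the filename is an assumption, not taken from the abstract):

```python
import numpy as np

def load_vec_file(path, max_words=50000):
    """Read a fastText-style .vec text file: a header "<vocab_size> <dim>",
    then one "<word> <float> ... <float>" line per word."""
    vectors = {}
    with open(path, encoding="utf-8") as fh:
        n_words, dim = map(int, fh.readline().split())
        for i, line in enumerate(fh):
            if i >= max_words:          # cap memory for large vocabularies
                break
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors, dim

# e.g. vectors, dim = load_vec_file("cc.en.300.vec")  # hypothetical local file
```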

Enriching word vectors with subword information

P Bojanowski, E Grave, A Joulin… - Transactions of the …, 2017 - direct.mit.edu
Continuous word representations, trained on large unlabeled corpora, are useful for many
natural language processing tasks. Popular models that learn such representations ignore …
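The core idea of the subword approach is that a word vector is the sum of vectors for its character n-grams, with boundary markers, plus the full word token. A minimal illustrative sketch, not the paper's implementation; `ngram_vectors` is a hypothetical lookup table:

```python
import numpy as np

def char_ngrams(word, n_min=3, n_max=6):
    """Character n-grams of '<word>' with boundary markers:
    'where' -> '<wh', 'whe', ..., 'ere>', plus the whole token '<where>'."""
    marked = f"<{word}>"
    grams = [marked[i:i + n] for n in range(n_min, n_max + 1)
             for i in range(len(marked) - n + 1)]
    grams.append(marked)
    return grams

def word_vector(word, ngram_vectors, dim=300):
    """Sum the vectors of a word's n-grams; unseen n-grams contribute nothing."""
    vec = np.zeros(dim, dtype=np.float32)
    for g in char_ngrams(word):
        if g in ngram_vectors:
            vec += ngram_vectors[g]
    return vec
```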

Word2vec convolutional neural networks for classification of news articles and tweets

B Jang, I Kim, JW Kim - PLoS ONE, 2019 - journals.plos.org
Big web data from sources including online news and Twitter are good resources for
investigating deep learning. However, collected news articles and tweets almost certainly …
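A common way to pair word2vec features with a convolutional classifier is an embedding layer feeding parallel convolutions of several widths with max-over-time pooling. The PyTorch sketch below shows that general architecture; it is not the authors' exact model, and all layer sizes are placeholder choices:

```python
import torch
import torch.nn as nn

class TextCNN(nn.Module):
    """Minimal CNN text classifier over word-vector sequences: embedding layer
    (optionally initialised from word2vec), parallel convolutions,
    max-over-time pooling, and a linear output layer."""
    def __init__(self, vocab_size, embed_dim=300, n_classes=2,
                 kernel_sizes=(3, 4, 5), n_filters=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, n_filters, k) for k in kernel_sizes)
        self.fc = nn.Linear(n_filters * len(kernel_sizes), n_classes)

    def forward(self, token_ids):                    # (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)    # (batch, dim, seq_len)
        pooled = [conv(x).relu().max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(pooled, dim=1))     # (batch, n_classes)

# e.g. logits = TextCNN(vocab_size=20000)(torch.randint(0, 20000, (8, 50)))
```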

Attention-based long short-term memory network using sentiment lexicon embedding for aspect-level sentiment analysis in Korean

M Song, H Park, K Shin - Information Processing & Management, 2019 - Elsevier
Although deep learning breakthroughs in NLP are based on learning distributed word
representations by neural language models, these methods suffer from a classic drawback …
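Aspect-level models in this family typically enrich token embeddings with lexicon-derived sentiment scores and pool a recurrent encoder's states with attention. A hedged PyTorch sketch of that general pattern (not the paper's architecture; the one-dimensional lexicon feature is an illustrative simplification):

```python
import torch
import torch.nn as nn

class AttnLSTMClassifier(nn.Module):
    """Word embeddings concatenated with a per-token lexicon score, a BiLSTM
    encoder, and additive attention that pools hidden states into one vector."""
    def __init__(self, vocab_size, embed_dim=200, hidden=128, n_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim + 1, hidden, bidirectional=True,
                            batch_first=True)
        self.attn = nn.Linear(2 * hidden, 1)
        self.out = nn.Linear(2 * hidden, n_classes)

    def forward(self, token_ids, lexicon_scores):
        # token_ids: (batch, seq); lexicon_scores: (batch, seq), e.g. in [-1, 1]
        x = torch.cat([self.embed(token_ids),
                       lexicon_scores.unsqueeze(-1)], dim=-1)
        h, _ = self.lstm(x)                           # (batch, seq, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)  # attention over tokens
        pooled = (weights * h).sum(dim=1)             # (batch, 2*hidden)
        return self.out(pooled)
```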

Deep neural networks reveal topic-level representations of sentences in medial prefrontal cortex, lateral anterior temporal lobe, precuneus, and angular gyrus

DJ Acunzo, DM Low, SL Fairhall - NeuroImage, 2022 - Elsevier
When reading a sentence, individual words can be combined to create more complex
meaning. In this study, we sought to uncover brain regions that reflect the representation of …

General and feature-based semantic representations in the semantic network

AG Liuzzi, A Aglinskas, SL Fairhall - Scientific Reports, 2020 - nature.com
How semantic representations are manifest over the brain remains a topic of active debate.
A semantic representation may be determined by specific semantic features (e.g., …

Iterative annotation of biomedical NER corpora with deep neural networks and knowledge bases

S Silvestri, F Gargiulo, M Ciampi - Applied Sciences, 2022 - mdpi.com
The large availability of clinical natural language documents, such as clinical narratives or
diagnoses, requires the definition of smart automatic systems for their processing and …

Subword-level word vector representations for Korean

S Park, J Byun, S Baek, Y Cho, A Oh - Proceedings of the 56th …, 2018 - aclanthology.org
Research on distributed word representations is focused on widely-used languages such as
English. Although the same methods can be used for other languages, language-specific …
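For Korean, subword methods usually operate below the syllable level by decomposing precomposed Hangul syllables into their constituent jamo. A small sketch of the standard Unicode decomposition arithmetic, offered as an illustration rather than the paper's pipeline:

```python
def decompose_hangul(text):
    """Decompose precomposed Hangul syllables (U+AC00..U+D7A3) into
    initial/medial/final jamo so that character n-grams can operate
    below the syllable level."""
    CHO = [chr(0x1100 + i) for i in range(19)]           # initial consonants
    JUNG = [chr(0x1161 + i) for i in range(21)]          # medial vowels
    JONG = [""] + [chr(0x11A8 + i) for i in range(27)]   # final consonants
    out = []
    for ch in text:
        code = ord(ch) - 0xAC00
        if 0 <= code < 11172:   # 19 * 21 * 28 precomposed syllables
            out.append(CHO[code // 588]
                       + JUNG[(code % 588) // 28]
                       + JONG[code % 28])
        else:
            out.append(ch)
    return "".join(out)

# e.g. decompose_hangul("한국어") yields a jamo sequence usable for n-grams
```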

Understanding the role of linguistic distributional knowledge in cognition

C Wingfield, L Connell - Language, Cognition and Neuroscience, 2022 - Taylor & Francis
The distributional pattern of words in language forms the basis of linguistic distributional
knowledge and contributes to conceptual processing, yet many questions remain regarding …
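In work of this kind, distributional knowledge is commonly operationalised as similarity between word vectors, most often cosine similarity. A minimal sketch; the `vectors` dict in the usage note is hypothetical:

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine similarity between two word vectors, the usual way distributional
    models quantify how similarly two words are used in text."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# e.g. cosine_similarity(vectors["doctor"], vectors["nurse"]) with vectors
# loaded from any of the embedding sets cited above
```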

subs2vec: Word embeddings from subtitles in 55 languages

J Van Paridon, B Thompson - Behavior Research Methods, 2021 - Springer
This paper introduces a novel collection of word embeddings, numerical representations of
lexical semantics, in 55 languages, trained on a large corpus of pseudo-conversational …
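Training embeddings on a subtitle-style corpus follows the usual skip-gram recipe. A short gensim sketch as an illustration only; the toy sentences are invented, the parameter names follow gensim >= 4.0, and the actual subs2vec pipeline may differ:

```python
from gensim.models import Word2Vec

# Hypothetical corpus: one tokenised subtitle line per list entry.
sentences = [
    ["where", "are", "you", "going"],
    ["i", "will", "be", "right", "back"],
]

# Skip-gram embeddings over the pseudo-conversational text.
model = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1)
vector = model.wv["going"]                         # 100-dimensional vector
similar = model.wv.most_similar("going", topn=3)   # nearest neighbours
```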