Neural machine translation: A review of methods, resources, and tools
Abstract Machine translation (MT) is an important sub-field of natural language processing
that aims to translate natural languages using computers. In recent years, end-to-end neural …
Transformer: A general framework from machine translation to others
Abstract Machine translation is an important and challenging task that aims at automatically
translating natural language sentences from one language into another. Recently …
On the cross-lingual transferability of monolingual representations
State-of-the-art unsupervised multilingual models (e.g., multilingual BERT) have been shown
to generalize in a zero-shot cross-lingual setting. This generalization ability has been …
Word translation without parallel data
State-of-the-art methods for learning cross-lingual word embeddings have relied on
bilingual dictionaries or parallel corpora. Recent studies showed that the need for parallel …
Unsupervised neural machine translation
In spite of the recent success of neural machine translation (NMT) in standard benchmarks,
the lack of large parallel corpora poses a major practical problem for many language pairs …
A robust self-learning method for fully unsupervised cross-lingual mappings of word embeddings
Recent work has managed to learn cross-lingual word embeddings without parallel data by
mapping monolingual embeddings to a shared space through adversarial training …
A survey of cross-lingual word embedding models
Cross-lingual representations of words enable us to reason about word meaning in
multilingual contexts and are a key facilitator of cross-lingual transfer when developing …
Emerging cross-lingual structure in pretrained language models
We study the problem of multilingual masked language modeling, i.e., the training of a single
model on concatenated text from multiple languages, and present a detailed study of several …
Gromov-Wasserstein alignment of word embedding spaces
D Alvarez-Melis, TS Jaakkola - arXiv preprint arXiv:1809.00013, 2018 - arxiv.org
Cross-lingual or cross-domain correspondences play key roles in tasks ranging from
machine translation to transfer learning. Recently, purely unsupervised methods operating …