Neural machine translation: A review of methods, resources, and tools

Z Tan, S Wang, Z Yang, G Chen, X Huang, M Sun… - AI Open, 2020 - Elsevier
Machine translation (MT) is an important sub-field of natural language processing
that aims to translate natural languages using computers. In recent years, end-to-end neural …

Transformer: A general framework from machine translation to others

Y Zhao, J Zhang, C Zong - Machine Intelligence Research, 2023 - Springer
Machine translation is an important and challenging task that aims at automatically
translating natural language sentences from one language into another. Recently …

On the cross-lingual transferability of monolingual representations

M Artetxe, S Ruder, D Yogatama - arXiv preprint arXiv:1910.11856, 2019 - arxiv.org
State-of-the-art unsupervised multilingual models (e.g., multilingual BERT) have been shown
to generalize in a zero-shot cross-lingual setting. This generalization ability has been …

Word translation without parallel data

A Conneau, G Lample, MA Ranzato, L Denoyer… - arXiv preprint arXiv …, 2017 - arxiv.org
State-of-the-art methods for learning cross-lingual word embeddings have relied on
bilingual dictionaries or parallel corpora. Recent studies showed that the need for parallel …

Unsupervised neural machine translation

M Artetxe, G Labaka, E Agirre, K Cho - arXiv preprint arXiv:1710.11041, 2017 - arxiv.org
In spite of the recent success of neural machine translation (NMT) in standard benchmarks,
the lack of large parallel corpora poses a major practical problem for many language pairs …

A robust self-learning method for fully unsupervised cross-lingual mappings of word embeddings

M Artetxe, G Labaka, E Agirre - arXiv preprint arXiv:1805.06297, 2018 - arxiv.org
Recent work has managed to learn cross-lingual word embeddings without parallel data by
mapping monolingual embeddings to a shared space through adversarial training …

A survey of cross-lingual word embedding models

S Ruder, I Vulić, A Søgaard - Journal of Artificial Intelligence Research, 2019 - jair.org
Cross-lingual representations of words enable us to reason about word meaning in
multilingual contexts and are a key facilitator of cross-lingual transfer when developing …

Emerging cross-lingual structure in pretrained language models

S Wu, A Conneau, H Li, L Zettlemoyer… - arXiv preprint arXiv …, 2019 - arxiv.org
We study the problem of multilingual masked language modeling, i.e., the training of a single
model on concatenated text from multiple languages, and present a detailed study of several …

Gromov-Wasserstein alignment of word embedding spaces

D Alvarez-Melis, TS Jaakkola - arXiv preprint arXiv:1809.00013, 2018 - arxiv.org
Cross-lingual or cross-domain correspondences play key roles in tasks ranging from
machine translation to transfer learning. Recently, purely unsupervised methods operating …