Survey of low-resource machine translation

B Haddow, R Bawden, AVM Barone, J Helcl… - Computational …, 2022 - direct.mit.edu
We present a survey covering the state of the art in low-resource machine translation (MT)
research. There are currently around 7,000 languages spoken in the world and almost all …

Findings of the 2021 conference on machine translation (WMT21)

F Akhbardeh, A Arkhangorodsky, M Biesialska, O Bojar… - Proceedings of the …, 2021 - cris.fbk.eu
This paper presents the results of the news translation task, the multilingual low-resource
translation for Indo-European languages, the triangular translation task, and the automatic …

Findings of the WMT 2022 shared tasks in unsupervised MT and very low resource supervised MT

M Weller-Di Marco, A Fraser - Proceedings of the Seventh …, 2022 - aclanthology.org
We present the findings of the WMT 2022 Shared Tasks in Unsupervised MT and Very Low
Resource Supervised MT with experiments on the language pairs German to/from Upper …

HW-TSC systems for WMT22 very low resource supervised MT Task

S Li, Y Luo, D Wei, Z Li, H Shang, X Chen… - Proceedings of the …, 2022 - aclanthology.org
This paper describes the submissions of Huawei translation services center (HW-TSC) to the
WMT22 Very Low Resource Supervised MT task. We participate in all 6 supervised tracks …

Small batch sizes improve training of low-resource neural MT

ÀR Atrio, A Popescu-Belis - arXiv preprint arXiv:2203.10579, 2022 - arxiv.org
We study the role of an essential hyper-parameter that governs the training of Transformers
for neural machine translation in a low-resource setting: the batch size. Using theoretical …

On the interaction of regularization factors in low-resource neural machine translation

ÀR Atrio, A Popescu-Belis - … of the 23rd Annual Conference of the …, 2022 - arodes.hes-so.ch
We explore the roles and interactions of the hyper-parameters governing
regularization, and propose a range of values applicable to low-resource neural machine …

Investigating Neural Machine Translation for Low-Resource Languages: Using Bavarian as a Case Study

WH Her, U Kruschwitz - arXiv preprint arXiv:2404.08259, 2024 - arxiv.org
Machine translation has made impressive progress in recent years, offering close to human-
level performance on many languages, but studies have primarily focused on high-resource …

The AIC system for the WMT 2022 unsupervised MT and very low resource supervised MT task

A Shapiro, M Salama, O Abdelhakim… - Proceedings of the …, 2022 - aclanthology.org
This paper presents our submissions to the WMT 22 shared task in the Unsupervised and Very
Low Resource Supervised Machine Translation tracks. The task revolves around translating …

Learning an artificial language for knowledge-sharing in multilingual translation

D Liu, J Niehues - arXiv preprint arXiv:2211.01292, 2022 - arxiv.org
The cornerstone of multilingual neural translation is shared representations across
languages. Given the theoretically infinite representation power of neural networks …

Translation memories as baselines for low-resource machine translation

R Knowles, P Littell - … of the Thirteenth Language Resources and …, 2022 - aclanthology.org
Low-resource machine translation research often requires building baselines to benchmark
estimates of progress in translation quality. Neural and statistical phrase-based systems are …