Neural machine translation: A review
F Stahlberg - Journal of Artificial Intelligence Research, 2020 - jair.org
The field of machine translation (MT), the automatic translation of written text from one
natural language into another, has experienced a major paradigm shift in recent years …
Neural machine translation: A review of methods, resources, and tools
Machine translation (MT) is an important sub-field of natural language processing
that aims to translate natural languages using computers. In recent years, end-to-end neural …
Mask-predict: Parallel decoding of conditional masked language models
Most machine translation systems generate text autoregressively from left to right. We
instead use a masked language modeling objective to train a model to predict any subset of …
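The mask-predict decoding loop described in this abstract can be illustrated with a toy sketch. Everything below is a hypothetical simplification for illustration: `toy_scorer` stands in for the real conditional masked language model, and the linear re-masking schedule is one common choice, not necessarily the paper's exact one.

```python
def mask_predict(scorer, length, iterations=3, mask="<mask>"):
    """Toy mask-predict loop: start fully masked, predict all positions in
    parallel, then iteratively re-mask and re-predict the least confident
    tokens on a linearly decaying schedule."""
    tokens = [mask] * length
    for t in range(iterations):
        # The scorer returns a (token, confidence) pair for every position.
        probs = scorer(tokens)
        for i, tok in enumerate(tokens):
            if tok == mask:
                tokens[i] = probs[i][0]
        if t == iterations - 1:
            break
        # Re-mask the n least confident positions for the next iteration.
        n = int(length * (1 - (t + 1) / iterations))
        confidences = [probs[i][1] for i in range(length)]
        for i in sorted(range(length), key=lambda i: confidences[i])[:n]:
            tokens[i] = mask
    return tokens

def toy_scorer(tokens):
    # Stand-in for the conditional masked LM: deterministic dummy tokens
    # with position-dependent confidences (a real model would condition on
    # the source sentence and the unmasked target tokens).
    return [(f"w{i}", 0.5 + 0.1 * i) for i in range(len(tokens))]
```

Unlike left-to-right decoding, every masked position is filled in the same pass, which is where the parallel speed-up comes from.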
A survey on non-autoregressive generation for neural machine translation and beyond
Non-autoregressive (NAR) generation, which was first proposed in neural machine translation
(NMT) to speed up inference, has attracted much attention in both machine learning and …
Glancing transformer for non-autoregressive neural machine translation
Recent work on non-autoregressive neural machine translation (NAT) aims to improve
decoding efficiency through parallel decoding without sacrificing quality. However, existing NAT …
Flowseq: Non-autoregressive conditional sequence generation with generative flow
Most sequence-to-sequence (seq2seq) models are autoregressive; they generate each
token by conditioning on previously generated tokens. In contrast, non-autoregressive …
Deep encoder, shallow decoder: Reevaluating non-autoregressive machine translation
Much recent effort has been invested in non-autoregressive neural machine translation,
which appears to be an efficient alternative to state-of-the-art autoregressive machine …
Fully non-autoregressive neural machine translation: Tricks of the trade
Fully non-autoregressive neural machine translation (NAT) is proposed to simultaneously
predict all tokens with a single forward pass of the network, which significantly reduces the …
Directed acyclic transformer for non-autoregressive machine translation
Non-autoregressive Transformers (NATs) significantly reduce the decoding latency
by generating all tokens in parallel. However, such independent predictions prevent NATs …
Non-autoregressive machine translation with latent alignments
This paper presents two strong methods, CTC and Imputer, for non-autoregressive machine
translation that model latent alignments with dynamic programming. We revisit CTC for …
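The CTC approach named in this abstract maps a latent alignment (which may contain repeats and blank symbols) onto the output translation. A minimal sketch of that collapsing rule, assuming the standard CTC convention of merging consecutive repeats and then removing blanks (the function name and blank symbol are illustrative, not from the paper):

```python
def ctc_collapse(alignment, blank="<b>"):
    """Collapse a CTC alignment into an output sequence: merge consecutive
    repeated tokens, then drop blank symbols."""
    out = []
    prev = None
    for tok in alignment:
        # Emit a token only when it differs from its predecessor and is
        # not the blank; a blank between two identical tokens therefore
        # allows a genuine repetition in the output.
        if tok != prev and tok != blank:
            out.append(tok)
        prev = tok
    return out
```

Because many alignments collapse to the same translation, training marginalizes over them with dynamic programming, which is what makes CTC tractable here.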