Progress in machine translation
After more than 70 years of evolution, great achievements have been made in machine translation. Especially in recent years, translation quality has been greatly improved with the …
Neural machine translation: A review of methods, resources, and tools
Machine translation (MT) is an important sub-field of natural language processing that aims to translate natural languages using computers. In recent years, end-to-end neural …
Graph-to-sequence learning using gated graph neural networks
Many NLP applications can be framed as a graph-to-sequence learning problem. Previous work proposing neural architectures on this setting obtained promising results compared to …
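The gated graph neural network encoder in this line of work repeatedly updates node states by exchanging messages along graph edges with GRU-style gating. Below is a minimal NumPy sketch of one such propagation step, assuming a single dense adjacency matrix and untyped edges; the function and parameter names are illustrative, not taken from the paper's code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ggnn_step(h, adj, W_msg, W_z, U_z, W_r, U_r, W_h, U_h):
    """One gated graph neural network propagation step (illustrative).

    h:   (num_nodes, d) current node states
    adj: (num_nodes, num_nodes) adjacency matrix of the sentence graph
    The remaining arguments are (d, d) weight matrices for GRU-style gates.
    """
    # Aggregate messages from neighbouring nodes.
    m = adj @ (h @ W_msg)                       # (num_nodes, d)
    z = sigmoid(m @ W_z + h @ U_z)              # update gate
    r = sigmoid(m @ W_r + h @ U_r)              # reset gate
    h_tilde = np.tanh(m @ W_h + (r * h) @ U_h)  # candidate state
    return (1 - z) * h + z * h_tilde            # gated update

# Toy usage: 4 nodes, hidden size 8, random graph and parameters.
rng = np.random.default_rng(0)
d, n = 8, 4
h = rng.normal(size=(n, d))
adj = (rng.random((n, n)) > 0.5).astype(float)
params = [rng.normal(scale=0.1, size=(d, d)) for _ in range(7)]
h_next = ggnn_step(h, adj, *params)
print(h_next.shape)  # (4, 8)
```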
Tree-to-sequence attentional neural machine translation
Most of the existing Neural Machine Translation (NMT) models focus on the conversion of sequential data and do not directly use syntactic information. We propose a novel end-to …
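Tree-based encoders of this kind build phrase vectors bottom-up over the source parse so that the decoder can attend to both word and phrase representations. The sketch below uses a plain tanh composition rather than the paper's Tree-LSTM; the tree format, embedding table, and parameter names are illustrative assumptions.

```python
import numpy as np

def encode_tree(node, embed, W, b):
    """Bottom-up composition of phrase vectors over a constituency tree.

    node:  a word string (leaf) or a (left, right) pair (internal node)
    embed: dict mapping words to (d,) vectors
    W, b:  composition parameters of shapes (2d, d) and (d,)
    Returns (vector_of_node, list_of_all_node_vectors) so a decoder could
    attend over both words and phrases.
    """
    if isinstance(node, str):                       # leaf: word embedding
        v = embed[node]
        return v, [v]
    left, right = node
    lv, lstates = encode_tree(left, embed, W, b)
    rv, rstates = encode_tree(right, embed, W, b)
    v = np.tanh(np.concatenate([lv, rv]) @ W + b)   # compose the two children
    return v, lstates + rstates + [v]

# Toy usage on the binary tree ((the cat) sleeps).
rng = np.random.default_rng(0)
d = 6
embed = {w: rng.normal(size=d) for w in ["the", "cat", "sleeps"]}
W, b = rng.normal(scale=0.1, size=(2 * d, d)), np.zeros(d)
root, states = encode_tree((("the", "cat"), "sleeps"), embed, W, b)
print(root.shape, len(states))  # (6,) 5
```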
Ordinal hyperplanes ranker with cost sensitivities for age estimation
In this paper, we propose an ordinal hyperplane ranking algorithm called OHRank, which estimates human ages via facial images. The design of the algorithm is based on the relative …
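Ranking-based age estimators of this kind decompose the ordinal label into a series of binary questions of the form "is the age greater than k?" and aggregate the answers. A minimal sketch of that aggregation rule is shown below, assuming the K hyperplanes have already been trained; the cost-sensitive training procedure itself is omitted and all names are illustrative.

```python
import numpy as np

def predict_age(x, hyperplanes, biases, min_age=1):
    """Aggregate K binary 'is age > k?' decisions into an age estimate.

    x:           (d,) feature vector for one face image
    hyperplanes: (K, d) one linear separator per ordinal threshold k
    biases:      (K,) corresponding biases
    The estimate is min_age plus the number of thresholds the sample passes,
    which is the usual ordinal-decomposition rule for ranking-based estimators.
    """
    scores = hyperplanes @ x + biases      # one score per binary question
    return min_age + int(np.sum(scores > 0))

# Toy usage with random separators over a 16-dimensional feature vector.
rng = np.random.default_rng(0)
K, d = 60, 16
hp = rng.normal(size=(K, d))
b = rng.normal(size=K)
x = rng.normal(size=d)
print(predict_age(x, hp, b))
```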
Improved neural machine translation with a syntax-aware encoder and decoder
Most neural machine translation (NMT) models are based on the sequential encoder-decoder framework, which makes no use of syntactic information. In this paper, we improve …
Modeling source syntax for neural machine translation
Even though a linguistics-free sequence-to-sequence model in neural machine translation (NMT) has a certain capability of implicitly learning syntactic information of source sentences …
Compositional generalization for neural semantic parsing via span-level supervised attention
We describe a span-level supervised attention loss that improves compositional generalization in semantic parsers. Our approach builds on existing losses that encourage …
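A supervised attention loss of this kind compares the model's attention distribution against annotated source spans. The sketch below shows one plausible form, taking the negative log of the attention mass inside the gold span so that mass leaking outside the span is penalised; the exact loss used in the paper may differ, and the indices and names here are illustrative.

```python
import numpy as np

def span_attention_loss(attn, span_start, span_end, eps=1e-9):
    """Penalise attention mass that falls outside a gold source span.

    attn:       (src_len,) attention weights for one decoding step (sum to 1)
    span_start: inclusive start index of the annotated span
    span_end:   exclusive end index of the annotated span
    Returns -log of the attention mass inside the span: 0 when all mass is
    on the span, growing as mass leaks outside it.
    """
    mass_in_span = attn[span_start:span_end].sum()
    return -np.log(mass_in_span + eps)

# Toy usage: 6 source tokens, gold span covering positions 2 and 3.
attn = np.array([0.05, 0.05, 0.4, 0.3, 0.1, 0.1])
print(round(span_attention_loss(attn, 2, 4), 3))  # -log(0.7) ~ 0.357
```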
Forest rescoring: Faster decoding with integrated language models
Efficient decoding has been a fundamental problem in machine translation, especially with an integrated language model which is essential for achieving good translation quality. We …
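The setting here is decoding with a translation model whose hypotheses must also be scored by an n-gram language model. The sketch below shows only the basic rescoring of a k-best list with a bigram LM, not the paper's forest-based algorithm; the score weights, model format, and toy data are assumptions.

```python
def lm_logprob(tokens, bigram_logprobs, unk=-10.0):
    """Score a hypothesis with a bigram language model (log probabilities)."""
    padded = ["<s>"] + tokens + ["</s>"]
    return sum(bigram_logprobs.get((a, b), unk)
               for a, b in zip(padded, padded[1:]))

def rescore(kbest, bigram_logprobs, lm_weight=0.5):
    """Re-rank (translation_model_score, tokens) pairs with an added LM term."""
    scored = [(tm + lm_weight * lm_logprob(toks, bigram_logprobs), toks)
              for tm, toks in kbest]
    return max(scored)

# Toy usage: two candidate translations with translation-model scores.
kbest = [(-2.0, ["the", "cat", "sleeps"]),
         (-1.8, ["cat", "the", "sleeps"])]
bigrams = {("<s>", "the"): -0.5, ("the", "cat"): -0.4,
           ("cat", "sleeps"): -0.6, ("sleeps", "</s>"): -0.3}
print(rescore(kbest, bigrams))  # picks the fluent word order
```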
Chinese syntactic reordering for statistical machine translation
Syntactic reordering approaches are an effective method for handling word-order differences between source and target languages in statistical machine translation (SMT) …
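Reordering approaches of this kind rewrite the source parse tree with hand-written rules so that its word order is closer to the target language before translation. The sketch below applies a single illustrative rule (moving a preverbal PP after the verb); it is not the paper's rule set, and the tree encoding is an assumption.

```python
def reorder(tree):
    """Recursively apply one hand-written rule to a (label, children) tree:
    inside a VP, move a preverbal PP modifier to a postverbal position,
    bringing the verb-modifier order closer to the target language before SMT.
    Leaves are plain token strings."""
    if isinstance(tree, str):
        return tree
    label, children = tree
    children = [reorder(c) for c in children]
    if label == "VP" and len(children) >= 2 and children[0][0] == "PP":
        children = children[1:] + children[:1]   # PP moves after the verb
    return (label, children)

def yield_tokens(tree):
    """Read the reordered surface string off the tree."""
    if isinstance(tree, str):
        return [tree]
    return [tok for child in tree[1] for tok in yield_tokens(child)]

# Toy usage: a VP with a preverbal PP is rewritten to verb-first order.
vp = ("VP", [("PP", ["in", "Beijing"]), ("VV", ["work"])])
print(yield_tokens(reorder(vp)))  # ['work', 'in', 'Beijing']
```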