Progress in machine translation

H Wang, H Wu, Z He, L Huang, KW Church - Engineering, 2022 - Elsevier
After more than 70 years of evolution, great achievements have been made in machine
translation. Especially in recent years, translation quality has been greatly improved with the …

Neural machine translation: A review of methods, resources, and tools

Z Tan, S Wang, Z Yang, G Chen, X Huang, M Sun… - AI Open, 2020 - Elsevier
Machine translation (MT) is an important sub-field of natural language processing
that aims to translate natural languages using computers. In recent years, end-to-end neural …

Graph-to-sequence learning using gated graph neural networks

D Beck, G Haffari, T Cohn - arXiv preprint arXiv:1806.09835, 2018 - arxiv.org
Many NLP applications can be framed as a graph-to-sequence learning problem. Previous
work proposing neural architectures on this setting obtained promising results compared to …

Tree-to-sequence attentional neural machine translation

A Eriguchi, K Hashimoto, Y Tsuruoka - arXiv preprint arXiv:1603.06075, 2016 - arxiv.org
Most of the existing Neural Machine Translation (NMT) models focus on the conversion of
sequential data and do not directly use syntactic information. We propose a novel end-to …

Ordinal hyperplanes ranker with cost sensitivities for age estimation

KY Chang, CS Chen, YP Hung - CVPR 2011, 2011 - ieeexplore.ieee.org
In this paper, we propose an ordinal hyperplane ranking algorithm called OHRank, which
estimates human ages via facial images. The design of the algorithm is based on the relative …

Improved neural machine translation with a syntax-aware encoder and decoder

H Chen, S Huang, D Chiang, J Chen - arXiv preprint arXiv:1707.05436, 2017 - arxiv.org
Most neural machine translation (NMT) models are based on the sequential encoder-
decoder framework, which makes no use of syntactic information. In this paper, we improve …

Modeling source syntax for neural machine translation

J Li, D Xiong, Z Tu, M Zhu, M Zhang, G Zhou - arXiv preprint arXiv …, 2017 - arxiv.org
Even though a linguistics-free sequence-to-sequence model in neural machine translation
(NMT) has a certain capability of implicitly learning syntactic information of source sentences …

Compositional generalization for neural semantic parsing via span-level supervised attention

P Yin, H Fang, G Neubig, A Pauls, EA Platanios, Y Su… - 2021 - dspace.mit.edu
We describe a span-level supervised attention loss that improves compositional
generalization in semantic parsers. Our approach builds on existing losses that encourage …

Forest rescoring: Faster decoding with integrated language models

L Huang, D Chiang - Proceedings of the 45th annual meeting of …, 2007 - aclanthology.org
Efficient decoding has been a fundamental problem in machine translation, especially with
an integrated language model which is essential for achieving good translation quality. We …

Chinese syntactic reordering for statistical machine translation

C Wang, M Collins, P Koehn - … of the 2007 Joint Conference on …, 2007 - aclanthology.org
Syntactic reordering approaches are an effective method for handling word-order
differences between source and target languages in statistical machine translation (SMT) …