Longyue Wang
Tencent AI Lab
Verified email at tencent.com · Homepage
Title
Cited by
Year
Siren's song in the AI ocean: a survey on hallucination in large language models
Y Zhang, Y Li, L Cui, D Cai, L Liu, T Fu, X Huang, E Zhao, Y Zhang, ...
arXiv preprint arXiv:2309.01219, 2023
Cited by 638 · 2023
Exploiting cross-sentence context for neural machine translation
L Wang, Z Tu, A Way, Q Liu
arXiv preprint arXiv:1704.04347, 2017
Cited by 232 · 2017
Convolutional self-attention networks
B Yang, L Wang, D Wong, LS Chao, Z Tu
arXiv preprint arXiv:1904.03107, 2019
Cited by 145 · 2019
UM-Corpus: A Large English-Chinese Parallel Corpus for Statistical Machine Translation
L Tian, DF Wong, LS Chao, P Quaresma, F Oliveira, L Yi, S Li, Y Wang, ...
LREC, 1837-1842, 2014
Cited by 137 · 2014
Macaw-llm: Multi-modal language modeling with image, audio, video, and text integration
C Lyu, M Wu, L Wang, X Huang, B Liu, Z Du, S Shi, Z Tu
arXiv preprint arXiv:2306.09093, 2023
Cited by 120 · 2023
Document-level machine translation with large language models
L Wang, C Lyu, T Ji, Z Zhang, D Yu, S Shi, Z Tu
arXiv preprint arXiv:2304.02210, 2023
Cited by 120 · 2023
Understanding and improving lexical choice in non-autoregressive translation
L Ding, L Wang, X Liu, DF Wong, D Tao, Z Tu
arXiv preprint arXiv:2012.14583, 2020
Cited by 116 · 2020
Modeling recurrence for transformer
J Hao, X Wang, B Yang, L Wang, J Zhang, Z Tu
arXiv preprint arXiv:1904.03092, 2019
Cited by 92 · 2019
Self-attention with cross-lingual position representation
L Ding, L Wang, D Tao
arXiv preprint arXiv:2004.13310, 2020
Cited by 79 · 2020
Self-attention with structural position representations
X Wang, Z Tu, L Wang, S Shi
arXiv preprint arXiv:1909.00383, 2019
Cited by 78 · 2019
Context-aware cross-attention for non-autoregressive translation
L Ding, L Wang, D Wu, D Tao, Z Tu
arXiv preprint arXiv:2011.00770, 2020
Cited by 71 · 2020
Rejuvenating low-frequency words: Making the most of parallel data in non-autoregressive translation
L Ding, L Wang, X Liu, DF Wong, D Tao, Z Tu
arXiv preprint arXiv:2106.00903, 2021
Cited by 64 · 2021
Redistributing low-frequency words: Making the most of monolingual data in non-autoregressive translation
L Ding, L Wang, S Shi, D Tao, Z Tu
Proceedings of the 60th Annual Meeting of the Association for Computational …, 2022
Cited by 60 · 2022
New trends in machine translation using large language models: Case examples with ChatGPT
C Lyu, J Xu, L Wang
arXiv preprint arXiv:2305.01181, 2023
Cited by 54* · 2023
Dynamic layer aggregation for neural machine translation with routing-by-agreement
ZY Dou, Z Tu, X Wang, L Wang, S Shi, T Zhang
Proceedings of the AAAI Conference on Artificial Intelligence 33 (01), 86-93, 2019
Cited by 53 · 2019
Progressive multi-granularity training for non-autoregressive translation
L Ding, L Wang, X Liu, DF Wong, D Tao, Z Tu
arXiv preprint arXiv:2106.05546, 2021
Cited by 50 · 2021
Translating pro-drop languages with reconstruction models
L Wang, Z Tu, S Shi, T Zhang, Y Graham, Q Liu
Proceedings of the AAAI Conference on Artificial Intelligence 32 (1), 2018
Cited by 49 · 2018
A novel approach to dropped pronoun translation
L Wang, Z Tu, X Zhang, H Li, A Way, Q Liu
arXiv preprint arXiv:1604.06285, 2016
Cited by 48 · 2016
Towards understanding neural machine translation with word importance
S He, Z Tu, X Wang, L Wang, MR Lyu, S Shi
arXiv preprint arXiv:1909.00326, 2019
Cited by 45 · 2019
Assessing the ability of self-attention networks to learn word order
B Yang, L Wang, DF Wong, LS Chao, Z Tu
arXiv preprint arXiv:1906.00592, 2019
Cited by 42 · 2019
Articles 1–20