Yongyu Mu
Verified email at stu.neu.edu.cn
Title · Cited by · Year
The NiuTrans machine translation systems for WMT20
Y Zhang, Z Wang, R Cao, B Wei, W Shan, S Zhou, A Reheman, T Zhou, ...
Proceedings of the Fifth Conference on Machine Translation, 338-345, 2020
Cited by 17 · 2020
Augmenting large language model translators via translation memories
Y Mu, A Reheman, Z Cao, Y Fan, B Li, Y Li, T Xiao, C Zhang, J Zhu
arXiv preprint arXiv:2305.17367, 2023
Cited by 15 · 2023
The NiuTrans machine translation systems for WMT21
S Zhou, T Zhou, B Wei, Y Luo, Y Mu, Z Zhou, C Wang, X Zhou, C Lv, ...
arXiv preprint arXiv:2109.10485, 2021
Cited by 4 · 2021
Improved Knowledge Distillation for Pre-trained Language Models via Knowledge Selection
C Wang, Y Lu, Y Mu, Y Hu, T Xiao, J Zhu
arXiv preprint arXiv:2302.00444, 2023
Cited by 3 · 2023
Hybrid Alignment Training for Large Language Models
C Wang, H Zhou, K Chang, B Li, Y Mu, T Xiao, T Liu, J Zhu
arXiv preprint arXiv:2406.15178, 2024
Cited by 1 · 2024
Large Language Models are Parallel Multilingual Learners
Y Mu, P Feng, Z Cao, Y Wu, B Li, C Wang, T Xiao, K Song, T Liu, C Zhang, ...
arXiv preprint arXiv:2403.09073, 2024
Cited by 1 · 2024
The NiuTrans System for the WMT21 Efficiency Task
C Wang, C Hu, Y Mu, Z Yan, S Wu, M Hu, H Cao, B Li, Y Lin, T Xiao, J Zhu
arXiv preprint arXiv:2109.08003, 2021
Cited by 1 · 2021
RoVRM: A Robust Visual Reward Model Optimized via Auxiliary Textual Preference Data
C Wang, Y Gan, Y Huo, Y Mu, M Yang, Q He, T Xiao, C Zhang, T Liu, ...
arXiv preprint arXiv:2408.12109, 2024
2024
Cross-layer Attention Sharing for Large Language Models
Y Mu, Y Wu, Y Fan, C Wang, H Li, Q He, M Yang, T Xiao, J Zhu
arXiv preprint arXiv:2408.01890, 2024
2024
Translate-and-Revise: Boosting Large Language Models for Constrained Translation
P Huang, Y Mu, Y Wu, B Li, C Xiao, T Xiao, J Zhu
arXiv preprint arXiv:2407.13164, 2024
2024
Articles 1–10