Maha Elbayad
Research scientist, Meta AI
Verified email at fb.com · Homepage
Title · Cited by · Year
No language left behind: Scaling human-centered machine translation
NLLB Team, MR Costa-jussà, J Cross, O Çelebi, M Elbayad, K Heafield, ...
arXiv preprint arXiv:2207.04672, 2022
Cited by 474* · 2022
Depth-adaptive transformer
M Elbayad, J Gu, E Grave, M Auli
arXiv preprint arXiv:1910.10073, 2019
Cited by 144 · 2019
Pervasive attention: 2D convolutional neural networks for sequence-to-sequence prediction
M Elbayad, L Besacier, J Verbeek
arXiv preprint arXiv:1808.03867, 2018
Cited by 113 · 2018
Findings of the IWSLT 2022 Evaluation Campaign.
A Anastasopoulos, L Barrault, L Bentivogli, MZ Boito, O Bojar, R Cattoni, ...
Proceedings of the 19th International Conference on Spoken Language …, 2022
Cited by 94 · 2022
Efficient wait-k models for simultaneous machine translation
M Elbayad, L Besacier, J Verbeek
arXiv preprint arXiv:2005.08595, 2020
Cited by 60 · 2020
SeamlessM4T - Massively Multilingual & Multimodal Machine Translation
L Barrault, YA Chung, MC Meglioli, D Dale, N Dong, PA Duquenne, ...
arXiv preprint arXiv:2308.11596, 2023
Cited by 45 · 2023
Seamless: Multilingual Expressive and Streaming Speech Translation
L Barrault, YA Chung, MC Meglioli, D Dale, N Dong, M Duppenthaler, ...
arXiv preprint arXiv:2312.05187, 2023
Cited by 26 · 2023
Token-level and sequence-level loss smoothing for RNN language models
M Elbayad, L Besacier, J Verbeek
arXiv preprint arXiv:1805.05062, 2018
Cited by 24 · 2018
Causes and cures for interference in multilingual translation
U Shaham, M Elbayad, V Goswami, O Levy, S Bhosale
arXiv preprint arXiv:2212.07530, 2022
Cited by 13 · 2022
No language left behind: Scaling human-centered machine translation (2022)
NLLB Team, MR Costa-jussà, J Cross, O Çelebi, M Elbayad, K Heafield, ...
URL https://arxiv.org/abs/2207.04672, 2022
Cited by 10 · 2022
Online versus offline NMT quality: An in-depth analysis on English-German and German-English
M Elbayad, M Ustaszewski, E Esperança-Rodier, FB Manquat, J Verbeek, ...
arXiv preprint arXiv:2006.00814, 2020
Cited by 10 · 2020
ON-TRAC consortium for end-to-end and simultaneous speech translation challenge tasks at IWSLT 2020
M Elbayad, H Nguyen, F Bougares, N Tomashenko, A Caubrière, ...
arXiv preprint arXiv:2005.11861, 2020
Cited by 9 · 2020
Fixing MoE over-fitting on low-resource languages in multilingual machine translation
M Elbayad, A Sun, S Bhosale
arXiv preprint arXiv:2212.07571, 2022
Cited by 5 · 2022
Spirit-lm: Interleaved spoken and written language model
TA Nguyen, B Muller, B Yu, MR Costa-Jussa, M Elbayad, S Popuri, ...
arXiv preprint arXiv:2402.05755, 2024
Cited by 3 · 2024
Merging text transformer models from different initializations
N Verma, M Elbayad
arXiv preprint arXiv:2403.00986, 2024
Cited by 2 · 2024
Towards being parameter-efficient: A stratified sparsely activated transformer with dynamic capacity
H Xu, M Elbayad, K Murray, J Maillard, V Goswami
arXiv preprint arXiv:2305.02176, 2023
Cited by 2 · 2023
Efficiently upgrading multilingual machine translation models to support more languages
S Sun, M Elbayad, A Sun, J Cross
arXiv preprint arXiv:2302.03528, 2023
Cited by 2 · 2023
Improved Training Techniques for Online Neural Machine Translation
M Elbayad, L Besacier, J Verbeek
Cited by 2 · 2020
Proceedings of the Second Workshop on Automatic Simultaneous Translation
H Wu, C Cherry, L Huang, Z He, Q Liu, M Elbayad, M Liberman, H Wang, ...
Proceedings of the Second Workshop on Automatic Simultaneous Translation, 2021
Cited by 1 · 2021
Rethinking the Design of Sequence-to-Sequence Models for Efficient Machine Translation
M Elbayad
Université Grenoble Alpes [2020-....], 2020
Cited by 1 · 2020
Articles 1–20