| Title | Authors | Venue | Cited by | Year |
|---|---|---|---|---|
| No language left behind: Scaling human-centered machine translation | NLLB Team, MR Costa-jussà, J Cross, O Çelebi, M Elbayad, K Heafield, ... | arXiv preprint arXiv:2207.04672, 2022 | 474* | 2022 |
| Depth-adaptive transformer | M Elbayad, J Gu, E Grave, M Auli | arXiv preprint arXiv:1910.10073, 2019 | 144 | 2019 |
| Pervasive attention: 2D convolutional neural networks for sequence-to-sequence prediction | M Elbayad, L Besacier, J Verbeek | arXiv preprint arXiv:1808.03867, 2018 | 113 | 2018 |
| Findings of the IWSLT 2022 Evaluation Campaign | A Anastasopoulos, L Barrault, L Bentivogli, MZ Boito, O Bojar, R Cattoni, ... | Proceedings of the 19th International Conference on Spoken Language …, 2022 | 94 | 2022 |
| Efficient wait-k models for simultaneous machine translation | M Elbayad, L Besacier, J Verbeek | arXiv preprint arXiv:2005.08595, 2020 | 60 | 2020 |
| SeamlessM4T - Massively Multilingual & Multimodal Machine Translation | L Barrault, YA Chung, MC Meglioli, D Dale, N Dong, PA Duquenne, ... | arXiv preprint arXiv:2308.11596, 2023 | 45 | 2023 |
| Seamless: Multilingual Expressive and Streaming Speech Translation | L Barrault, YA Chung, MC Meglioli, D Dale, N Dong, M Duppenthaler, ... | arXiv preprint arXiv:2312.05187, 2023 | 26 | 2023 |
| Token-level and sequence-level loss smoothing for RNN language models | M Elbayad, L Besacier, J Verbeek | arXiv preprint arXiv:1805.05062, 2018 | 24 | 2018 |
| Causes and cures for interference in multilingual translation | U Shaham, M Elbayad, V Goswami, O Levy, S Bhosale | arXiv preprint arXiv:2212.07530, 2022 | 13 | 2022 |
| No language left behind: Scaling human-centered machine translation (2022) | NLLB Team, MR Costa-jussà, J Cross, O Çelebi, M Elbayad, K Heafield, ... | URL https://arxiv.org/abs/2207.04672, 2022 | 10 | 2022 |
| Online versus offline NMT quality: An in-depth analysis on English-German and German-English | M Elbayad, M Ustaszewski, E Esperança-Rodier, FB Manquat, J Verbeek, ... | arXiv preprint arXiv:2006.00814, 2020 | 10 | 2020 |
| ON-TRAC consortium for end-to-end and simultaneous speech translation challenge tasks at IWSLT 2020 | M Elbayad, H Nguyen, F Bougares, N Tomashenko, A Caubrière, ... | arXiv preprint arXiv:2005.11861, 2020 | 9 | 2020 |
| Fixing MoE over-fitting on low-resource languages in multilingual machine translation | M Elbayad, A Sun, S Bhosale | arXiv preprint arXiv:2212.07571, 2022 | 5 | 2022 |
| SpiRit-LM: Interleaved spoken and written language model | TA Nguyen, B Muller, B Yu, MR Costa-Jussa, M Elbayad, S Popuri, ... | arXiv preprint arXiv:2402.05755, 2024 | 3 | 2024 |
| Merging text transformer models from different initializations | N Verma, M Elbayad | arXiv preprint arXiv:2403.00986, 2024 | 2 | 2024 |
| Towards being parameter-efficient: A stratified sparsely activated transformer with dynamic capacity | H Xu, M Elbayad, K Murray, J Maillard, V Goswami | arXiv preprint arXiv:2305.02176, 2023 | 2 | 2023 |
| Efficiently upgrading multilingual machine translation models to support more languages | S Sun, M Elbayad, A Sun, J Cross | arXiv preprint arXiv:2302.03528, 2023 | 2 | 2023 |
| Improved Training Techniques for Online Neural Machine Translation | M Elbayad, L Besacier, J Verbeek | | 2 | 2020 |
| Proceedings of the Second Workshop on Automatic Simultaneous Translation | H Wu, C Cherry, L Huang, Z He, Q Liu, M Elbayad, M Liberman, H Wang, ... | Proceedings of the Second Workshop on Automatic Simultaneous Translation, 2021 | 1 | 2021 |
| Rethinking the Design of Sequence-to-Sequence Models for Efficient Machine Translation | M Elbayad | Université Grenoble Alpes, 2020 | 1 | 2020 |