L Dong, N Yang, W Wang, F Wei, X Liu, Y Wang, J Gao, M Zhou, HW Hon. "Unified language model pre-training for natural language understanding and generation." Advances in Neural Information Processing Systems 32, 2019. Cited by 1653.
H Bao, L Dong, F Wei, W Wang, N Yang, X Liu, Y Wang, J Gao, S Piao, et al. "UniLMv2: Pseudo-masked language models for unified language model pre-training." International Conference on Machine Learning, 642-652, 2020. Cited by 390.
X Liu, H Cheng, P He, W Chen, Y Wang, H Poon, J Gao. "Adversarial training for large neural language models." arXiv preprint arXiv:2004.08994, 2020. Cited by 171.
X Liu, Y Wang, J Ji, H Cheng, X Zhu, E Awa, P He, W Chen, H Poon, et al. "The Microsoft toolkit of multi-task deep neural networks for natural language understanding." arXiv preprint arXiv:2002.07972, 2020. Cited by 53.