WizardLM: Empowering large language models to follow complex instructions. C Xu, Q Sun, K Zheng, X Geng, P Zhao, J Feng, C Tao, D Jiang. ICLR 2024, 2023. Cited by 433.
WizardCoder: Empowering code large language models with Evol-Instruct. Z Luo, C Xu, P Zhao, Q Sun, X Geng, W Hu, C Tao, J Ma, Q Lin, D Jiang. ICLR 2024, 2023. Cited by 262.
Knowledge-grounded dialogue generation with pre-trained language models. X Zhao, W Wu, C Xu, C Tao, D Zhao, R Yan. EMNLP 2020. Cited by 201.
WizardMath: Empowering mathematical reasoning for large language models via reinforced Evol-Instruct. H Luo, Q Sun, C Xu, P Zhao, J Lou, C Tao, X Geng, Q Lin, S Chen, et al. arXiv preprint arXiv:2308.09583, 2023. Cited by 151.
Multi-representation fusion network for multi-turn response selection in retrieval-based chatbots. C Tao, W Wu, C Xu, W Hu, D Zhao, R Yan. Proceedings of the Twelfth ACM International Conference on Web Search and …, 2019. Cited by 149.
One time of interaction may not be enough: Go deep with an interaction-over-interaction network for response selection in dialogues. C Tao, W Wu, C Xu, W Hu, D Zhao, R Yan. Proceedings of the 57th Annual Meeting of the Association for Computational …, 2019. Cited by 132.
Low-resource knowledge-grounded dialogue generation. X Zhao, W Wu, C Tao, C Xu, D Zhao, R Yan. ICLR 2020. Cited by 107.
A sequential matching framework for multi-turn response selection in retrieval-based chatbots. Y Wu, W Wu, C Xing, C Xu, Z Li, M Zhou. Computational Linguistics 45(1), 163–197, 2019. Cited by 94.
Neural response generation with dynamic vocabularies. Y Wu, W Wu, D Yang, C Xu, Z Li, M Zhou. AAAI 2018, 2017. Cited by 80.
Phi-3 technical report: A highly capable language model locally on your phone. M Abdin, SA Jacobs, AA Awan, J Aneja, A Awadallah, H Awadalla, et al. arXiv preprint arXiv:2404.14219, 2024. Cited by 74.
PromDA: Prompt-based data augmentation for low-resource NLU tasks. Y Wang, C Xu, Q Sun, H Hu, C Tao, X Geng, D Jiang. ACL 2022. Cited by 73.
Zero-resource knowledge-grounded dialogue generation. L Li, C Xu, W Wu, Y Zhao, X Zhao, C Tao. NeurIPS 2020. Cited by 73.
ProphetNet-X: Large-scale pre-training models for English, Chinese, multi-lingual, dialog, and code generation. W Qi, Y Gong, Y Yan, C Xu, B Yao, B Zhou, B Cheng, D Jiang, J Chen, et al. ACL 2021 (demo). Cited by 57.
A document-grounded matching network for response selection in retrieval-based chatbots. X Zhao, C Tao, W Wu, C Xu, D Zhao, R Yan. IJCAI 2019. Cited by 43.
Knowledge enhanced hybrid neural network for text matching. Y Wu, W Wu, C Xu, Z Li, M Zhou. AAAI 2018, 2016. Cited by 43.
MPC-BERT: A pre-trained language model for multi-party conversation understanding. JC Gu, C Tao, ZH Ling, C Xu, X Geng, D Jiang. ACL 2021. Cited by 41.
Neural response generation with meta-words. C Xu, W Wu, C Tao, H Hu, M Schuerman, Y Wang. Proceedings of the 57th Annual Meeting of the Association for Computational …, 2019. Cited by 40.
Learning neural templates for recommender dialogue system. Z Liang, H Hu, C Xu, J Miao, Y He, Y Chen, X Geng, F Liang, D Jiang. EMNLP 2021. Cited by 39.
Multimodal dialogue response generation. Q Sun, Y Wang, C Xu, K Zheng, Y Yang, H Hu, F Xu, J Zhang, X Geng, et al. ACL 2022, 2021. Cited by 34.
Towards explainable and controllable open domain dialogue generation with dialogue acts. C Xu, W Wu, Y Wu. arXiv preprint arXiv:1807.07255, 2018. Cited by 34.