Tianyi Tang
Other names: 唐天一
Verified email at ruc.edu.cn - Homepage
Title
Cited by
Year
A survey of large language models
WX Zhao, K Zhou, J Li, T Tang, X Wang, Y Hou, Y Min, B Zhang, J Zhang, ...
arXiv preprint arXiv:2303.18223, 2023
Cited by 2044* · 2023
A survey of pretrained language models based text generation
J Li, T Tang, WX Zhao, JY Nie, JR Wen
arXiv preprint arXiv:2201.05273, 2022
Cited by 273* · 2022
Not all languages are created equal in LLMs: Improving multilingual capability by cross-lingual-thought prompting
H Huang, T Tang, D Zhang, WX Zhao, T Song, Y Xia, F Wei
arXiv preprint arXiv:2305.07004, 2023
Cited by 49 · 2023
Few-shot knowledge graph-to-text generation with pretrained language models
J Li, T Tang, WX Zhao, Z Wei, NJ Yuan, JR Wen
arXiv preprint arXiv:2106.01623, 2021
Cited by 49 · 2021
A survey on long text modeling with transformers
Z Dong, T Tang, L Li, WX Zhao
arXiv preprint arXiv:2302.14502, 2023
Cited by 36 · 2023
TextBox 2.0: A text generation library with pre-trained language models
T Tang, J Li, Z Chen, Y Hu, Z Yu, W Dai, Z Dong, X Cheng, Y Wang, ...
arXiv preprint arXiv:2212.13005, 2022
Cited by 33* · 2022
Learning to transfer prompts for text generation
J Li, T Tang, JY Nie, JR Wen, WX Zhao
arXiv preprint arXiv:2205.01543, 2022
Cited by 30 · 2022
MVP: Multi-task supervised pre-training for natural language generation
T Tang, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2206.12131, 2022
Cited by 27 · 2022
Context-tuning: Learning contextualized prompts for natural language generation
T Tang, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2201.08670, 2022
Cited by 22 · 2022
ELMER: A non-autoregressive pre-trained language model for efficient and effective text generation
J Li, T Tang, WX Zhao, JY Nie, JR Wen
arXiv preprint arXiv:2210.13304, 2022
Cited by 13 · 2022
Bamboo: A comprehensive benchmark for evaluating long text modeling capacities of large language models
Z Dong, T Tang, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2309.13345, 2023
Cited by 12 · 2023
Not All Metrics Are Guilty: Improving NLG Evaluation by Diversifying References
T Tang, H Lu, Y Jiang, H Huang, D Zhang, WX Zhao, T Kocmi, F Wei
Proceedings of the 2024 Conference of the North American Chapter of the …, 2024
Cited by 11* · 2024
Beyond imitation: Leveraging fine-grained quality signals for alignment
G Guo, R Zhao, T Tang, WX Zhao, JR Wen
arXiv preprint arXiv:2311.04072, 2023
Cited by 9 · 2023
The Web Can Be Your Oyster for Improving Large Language Models
J Li, T Tang, WX Zhao, J Wang, JY Nie, JR Wen
arXiv preprint arXiv:2305.10998, 2023
Cited by 8* · 2023
Learning to imagine: Visually-augmented natural language generation
T Tang, Y Chen, Y Du, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2305.16944, 2023
Cited by 7 · 2023
Zero-shot visual question answering with language model feedback
Y Du, J Li, T Tang, WX Zhao, JR Wen
arXiv preprint arXiv:2305.17006, 2023
Cited by 5 · 2023
ElitePLM: An empirical study on general language ability evaluation of pretrained language models
J Li, T Tang, Z Gong, L Yang, Z Yu, Z Chen, J Wang, WX Zhao, JR Wen
arXiv preprint arXiv:2205.01523, 2022
Cited by 5 · 2022
Towards effective ancient Chinese translation: Dataset, model, and evaluation
G Guo, J Yang, F Lu, J Qin, T Tang, WX Zhao
CCF International Conference on Natural Language Processing and Chinese …, 2023
Cited by 3 · 2023
Language-Specific Neurons: The Key to Multilingual Capabilities in Large Language Models
T Tang, W Luo, H Huang, D Zhang, X Wang, X Zhao, F Wei, JR Wen
arXiv preprint arXiv:2402.16438, 2024
Cited by 2 · 2024
Generating Long and Coherent Text with Multi-Level Generative Adversarial Networks
T Tang, J Li, WX Zhao, JR Wen
Web and Big Data: 5th International Joint Conference, APWeb-WAIM 2021 …, 2021
Cited by 1 · 2021
Articles 1–20