Weize Chen
Verified email at mails.tsinghua.edu.cn
Title
Cited by
Year
Parameter-efficient fine-tuning of large-scale pre-trained language models
N Ding, Y Qin, G Yang, F Wei, Z Yang, Y Su, S Hu, Y Chen, CM Chan, ...
Nature Machine Intelligence 5 (3), 220-235, 2023
303 · 2023
Communicative agents for software development
C Qian, X Cong, C Yang, W Chen, Y Su, J Xu, Z Liu, M Sun
arXiv preprint arXiv:2307.07924, 2023
234 · 2023
Delta tuning: A comprehensive study of parameter efficient methods for pre-trained language models
N Ding, Y Qin, G Yang, F Wei, Z Yang, Y Su, S Hu, Y Chen, CM Chan, ...
arXiv preprint arXiv:2203.06904, 2022
182 · 2022
Chateval: Towards better llm-based evaluators through multi-agent debate
CM Chan, W Chen, Y Su, J Yu, W Xue, S Zhang, J Fu, Z Liu
arXiv preprint arXiv:2308.07201, 2023
142 · 2023
Agentverse: Facilitating multi-agent collaboration and exploring emergent behaviors in agents
W Chen, Y Su, J Zuo, C Yang, C Yuan, C Qian, CM Chan, Y Qin, Y Lu, ...
arXiv preprint arXiv:2308.10848, 2023
99 · 2023
Fully hyperbolic neural networks
W Chen, X Han, Y Lin, H Zhao, Z Liu, P Li, M Sun, J Zhou
arXiv preprint arXiv:2105.14686, 2021
78 · 2021
Agentverse: Facilitating multi-agent collaboration and exploring emergent behaviors
W Chen, Y Su, J Zuo, C Yang, C Yuan, CM Chan, H Yu, Y Lu, YH Hung, ...
The Twelfth International Conference on Learning Representations, 2023
33 · 2023
Exploring low-dimensional intrinsic task subspace via prompt tuning
Y Qin, X Wang, Y Su, Y Lin, N Ding, Z Liu, J Li, L Hou, P Li, M Sun, J Zhou
arXiv preprint arXiv:2110.07867, 2021
33 · 2021
Gact: Activation compressed training for generic network architectures
X Liu, L Zheng, D Wang, Y Cen, W Chen, X Han, J Chen, Z Liu, J Tang, ...
International Conference on Machine Learning, 14139-14152, 2022
22 · 2022
Exploring mode connectivity for pre-trained language models
Y Qin, C Qian, J Yi, W Chen, Y Lin, X Han, Z Liu, M Sun, J Zhou
arXiv preprint arXiv:2210.14102, 2022
15 · 2022
Exploring universal intrinsic task subspace via prompt tuning
Y Qin, X Wang, Y Su, Y Lin, N Ding, J Yi, W Chen, Z Liu, J Li, L Hou, P Li, ...
arXiv preprint arXiv:2110.07867, 2021
11 · 2021
Cross-lingual contrastive learning for fine-grained entity typing for low-resource languages
X Han, Y Luo, W Chen, Z Liu, M Sun, Z Botong, H Fei, S Zheng
Proceedings of the 60th Annual Meeting of the Association for Computational …, 2022
10 · 2022
Quantifying similarity between relations with fact distribution
W Chen, H Zhu, X Han, Z Liu, M Sun
arXiv preprint arXiv:1907.08937, 2019
9 · 2019
Experiential co-learning of software-developing agents
C Qian, Y Dang, J Li, W Liu, W Chen, C Yang, Z Liu, M Sun
arXiv preprint arXiv:2312.17025, 2023
8 · 2023
Delta tuning: A comprehensive study of parameter efficient methods for pre-trained language models. CoRR, abs/2203.06904, 2022. doi: 10.48550
N Ding, Y Qin, G Yang, F Wei, Z Yang, Y Su, S Hu, Y Chen, CM Chan, ...
arXiv preprint arXiv:2203.06904, 2022
7 · 2022
D-bot: Database diagnosis system using large language models
X Zhou, G Li, Z Sun, Z Liu, W Chen, J Wu, J Liu, R Feng, G Zeng
arXiv preprint arXiv:2312.01454, 2023
6 · 2023
Knowledge representation learning and knowledge-guided NLP
X Han, W Chen, Z Liu, Y Lin, M Sun
Representation Learning for Natural Language Processing, 273, 2023
2 · 2023
Different tunes played with equal skill: Exploring a unified optimization subspace for parameter-efficient tuning
J Yi, W Chen, Y Qin, Y Lin, N Ding, X Han, Z Liu, M Sun, J Zhou
Findings of the Association for Computational Linguistics: EMNLP 2022, 3348-3366, 2022
2 · 2022
Autonomous Agents for Collaborative Task under Information Asymmetry
W Liu, C Wang, Y Wang, Z Xie, R Qiu, Y Dang, Z Du, W Chen, C Yang, ...
arXiv preprint arXiv:2406.14928, 2024
1 · 2024
Iterative Experience Refinement of Software-Developing Agents
C Qian, J Li, Y Dang, W Liu, YF Wang, Z Xie, W Chen, C Yang, Y Zhang, ...
arXiv preprint arXiv:2405.04219, 2024
1 · 2024
Articles 1–20