Chenlong Deng
Verified email at ruc.edu.cn
Title
Cited by
Year
Large language models for information retrieval: A survey
Y Zhu, H Yuan, S Wang, J Liu, W Liu, C Deng, H Chen, Z Liu, Z Dou, ...
arXiv preprint arXiv:2308.07107, 2023
Cited by 278 · 2023
Improving personalized search with dual-feedback network
C Deng, Y Zhou, Z Dou
Proceedings of the Fifteenth ACM International Conference on Web Search and …, 2022
Cited by 11 · 2022
ChatRetriever: Adapting Large Language Models for Generalized and Robust Conversational Dense Retrieval
K Mao, C Deng, H Chen, F Mo, Z Liu, T Sakai, Z Dou
arXiv preprint arXiv:2404.13556, 2024
Cited by 10 · 2024
An element is worth a thousand words: Enhancing legal case retrieval by incorporating legal elements
C Deng, Z Dou, Y Zhou, P Zhang, K Mao
Findings of the Association for Computational Linguistics: ACL 2024, 2354-2365, 2024
Cited by 4 · 2024
Enabling discriminative reasoning in LLMs for legal judgment prediction
C Deng, K Mao, Y Zhang, Z Dou
arXiv preprint arXiv:2407.01964, 2024
Cited by 3* · 2024
RAG-Studio: Towards In-Domain Adaptation of Retrieval Augmented Generation Through Self-Alignment
K Mao, Z Liu, H Qian, F Mo, C Deng, Z Dou
Findings of the Association for Computational Linguistics: EMNLP 2024, 725-735, 2024
Cited by 1 · 2024
Learning Interpretable Legal Case Retrieval via Knowledge-Guided Case Reformulation
C Deng, K Mao, Z Dou
arXiv preprint arXiv:2406.19760, 2024
Cited by 1 · 2024
A Silver Bullet or a Compromise for Full Attention? A Comprehensive Study of Gist Token-based Context Compression
C Deng, Z Zhang, K Mao, S Li, X Huang, D Yu, Z Dou
arXiv preprint arXiv:2412.17483, 2024
2024
Attention Entropy is a Key Factor: An Analysis of Parallel Context Encoding with Full-attention-based Pre-trained Language Models
Z Zhang, Y Wang, X Huang, T Fang, H Zhang, C Deng, S Li, D Yu
arXiv preprint arXiv:2412.16545, 2024
2024
Articles 1–9