Pre-train, Prompt, and Recommendation: A Comprehensive Survey of Language Modeling Paradigm Adaptations in Recommender Systems

P Liu, L Zhang, JA Gulla - Transactions of the Association for …, 2023 - direct.mit.edu
The emergence of Pre-trained Language Models (PLMs) has brought tremendous success to the field of Natural Language Processing (NLP) by learning universal representations on …
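As a purely illustrative sketch of the "pre-train, prompt" paradigm this survey covers (not any specific method from it), the snippet below scores candidate items by filling a cloze-style prompt with a masked language model; the movie names, prompt wording, and the single-token-candidate simplification are all assumptions.

```python
# Illustrative sketch: prompt a pre-trained masked LM to rank candidate items.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

history = ["inception", "interstellar", "memento"]   # hypothetical user history
candidates = ["titanic", "frozen", "avatar"]         # hypothetical single-token candidates

# Cloze-style prompt: the [MASK] slot stands in for the next recommended item.
prompt = f"The user watched {', '.join(history)}. Next they should watch [MASK]."
inputs = tokenizer(prompt, return_tensors="pt")
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]

with torch.no_grad():
    logits = model(**inputs).logits[0, mask_pos]     # vocabulary logits at the mask

# Rank candidates by the logit of their token at the masked position.
scores = {c: logits[tokenizer.convert_tokens_to_ids(c)].item() for c in candidates}
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```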

Leveraging large language models for pre-trained recommender systems

Z Chu, H Hao, X Ouyang, S Wang, Y Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
Recent advancements in recommendation systems have shifted towards more
comprehensive and personalized recommendations by utilizing large language models …
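To make the general idea concrete (this is an assumed setup, not the paper's architecture), the sketch below uses a pre-trained text encoder to embed item descriptions and a user profile, then ranks items by cosine similarity; the model name, catalogue, and profile text are hypothetical.

```python
# Illustrative sketch: rank items by similarity between text embeddings.
from sentence_transformers import SentenceTransformer, util

encoder = SentenceTransformer("all-MiniLM-L6-v2")    # any pre-trained sentence encoder

item_texts = {                                       # hypothetical item catalogue
    "item_1": "Wireless noise-cancelling headphones",
    "item_2": "Trail running shoes with cushioned sole",
    "item_3": "Espresso machine with milk frother",
}
user_profile = "Enjoys long-distance running and outdoor sports"   # hypothetical profile

item_emb = encoder.encode(list(item_texts.values()), convert_to_tensor=True)
user_emb = encoder.encode(user_profile, convert_to_tensor=True)

scores = util.cos_sim(user_emb, item_emb)[0]         # one cosine score per item
ranking = sorted(zip(item_texts, scores.tolist()), key=lambda kv: -kv[1])
print(ranking)                                       # items ordered by relevance to the profile
```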

Enhancing recommender systems with large language model reasoning graphs

Y Wang, Z Chu, X Ouyang, S Wang, H Hao… - arXiv preprint arXiv …, 2023 - arxiv.org
Recommendation systems aim to provide users with relevant suggestions, but often lack
interpretability and fail to capture higher-level semantic relationships between user …
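The toy sketch below is only meant to illustrate the general notion of a reasoning graph for interpretable recommendation, not this paper's construction: hypothetical (head, relation, tail) triples, assumed to have been distilled from an LLM, link a user's preferences to items, and the connecting path serves as an explanation.

```python
# Illustrative sketch: explain a recommendation via a path in a small reasoning graph.
import networkx as nx

# Hypothetical triples, e.g. distilled from an LLM's reasoning about a user.
triples = [
    ("user", "prefers", "sci-fi"),
    ("sci-fi", "characterizes", "Interstellar"),
    ("user", "prefers", "nonlinear plots"),
    ("nonlinear plots", "characterizes", "Memento"),
]

g = nx.DiGraph()
for head, relation, tail in triples:
    g.add_edge(head, tail, relation=relation)

# A candidate item is explained by the path that connects the user to it.
for item in ("Interstellar", "Memento"):
    path = nx.shortest_path(g, "user", item)
    print(item, "<-", " -> ".join(path))
```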

Adaptive Learning on User Segmentation: Universal to Specific Representation via Bipartite Neural Interaction

X Tan, Y Deng, C Qu, S Xue, X Shi, J Zhang… - Proceedings of the …, 2023 - dl.acm.org
Recently, models for user representation learning have been widely applied in click-through rate (CTR) and conversion rate (CVR) prediction. Usually, the model learns a universal user …
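As a minimal sketch of the universal-to-specific idea only (not the paper's bipartite neural interaction), the snippet below specializes a shared user embedding with a per-segment adapter before CTR prediction; all layer sizes, counts, and the adapter design are assumptions.

```python
# Illustrative sketch: universal user embedding + segment-specific adapter for CTR.
import torch
import torch.nn as nn

class SegmentedCTRModel(nn.Module):
    def __init__(self, n_users, n_items, n_segments, dim=32):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)    # universal user representation
        self.item_emb = nn.Embedding(n_items, dim)
        # One small adapter per segment maps the universal vector to a specific one.
        self.adapters = nn.ModuleList(nn.Linear(dim, dim) for _ in range(n_segments))
        self.head = nn.Linear(2 * dim, 1)

    def forward(self, user, item, segment):
        u = self.user_emb(user)
        u = torch.stack([self.adapters[s](u[i]) for i, s in enumerate(segment.tolist())])
        x = torch.cat([u, self.item_emb(item)], dim=-1)
        return torch.sigmoid(self.head(x)).squeeze(-1)   # predicted click probability

model = SegmentedCTRModel(n_users=1000, n_items=500, n_segments=4)
p = model(torch.tensor([3, 7]), torch.tensor([42, 17]), torch.tensor([0, 2]))
print(p)   # one CTR estimate per (user, item, segment) triple
```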