Dense text retrieval based on pretrained language models: A survey

WX Zhao, J Liu, R Ren, JR Wen - ACM Transactions on Information …, 2024 - dl.acm.org
Text retrieval is a long-standing research topic in information seeking, where a system is
required to return relevant information resources for users' queries in natural language. From …

Does Negative Sampling Matter? A Review with Insights into its Theory and Applications

Z Yang, M Ding, T Huang, Y Cen, J Song… - … on Pattern Analysis …, 2024 - ieeexplore.ieee.org
Negative sampling has swiftly risen to prominence as a focal point of research, with wide-
ranging applications spanning machine learning, computer vision, natural language …

Improving contrastive learning of sentence embeddings from AI feedback

Q Cheng, X Yang, T Sun, L Li, X Qiu - arXiv preprint arXiv:2305.01918, 2023 - arxiv.org
Contrastive learning has become a popular approach in natural language processing,
particularly for the learning of sentence embeddings. However, the discrete nature of natural …
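The entry above describes the standard contrastive setup for sentence embeddings: pulling embeddings of two views of the same sentence together while pushing apart other sentences in the batch. A minimal sketch of that in-batch InfoNCE objective, using NumPy and an illustrative temperature value (the function name and toy data are assumptions, not from any of the cited papers):

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.05):
    """In-batch contrastive (InfoNCE) loss: each anchor's positive is the
    matching row of `positives`; all other rows serve as negatives.
    Embeddings are L2-normalized so dot products are cosine similarities."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sims = a @ p.T / temperature  # (batch, batch) similarity matrix
    # row-wise log-softmax; the diagonal holds each anchor's positive pair
    log_probs = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# toy batch: 4 anchor/positive embedding pairs of dimension 8,
# positives are slightly perturbed "views" of the anchors
rng = np.random.default_rng(0)
anchors = rng.normal(size=(4, 8))
positives = anchors + 0.1 * rng.normal(size=(4, 8))
loss = info_nce_loss(anchors, positives)
```

In practice the two views come from an encoder (e.g. different dropout masks or augmentations), not from additive noise as in this toy example.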

mCLIP: Multilingual CLIP via cross-lingual transfer

G Chen, L Hou, Y Chen, W Dai, L Shang… - Proceedings of the …, 2023 - aclanthology.org
Large-scale vision-language pretrained (VLP) models like CLIP have shown remarkable
performance on various downstream cross-modal tasks. However, they are usually biased …

WhitenedCSE: Whitening-based contrastive learning of sentence embeddings

W Zhuo, Y Sun, X Wang, L Zhu… - Proceedings of the 61st …, 2023 - aclanthology.org
This paper presents a whitening-based contrastive learning method for sentence
embedding learning (WhitenedCSE), which combines contrastive learning with a novel …
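The snippet above names whitening as the ingredient combined with contrastive learning. As context, a minimal PCA-whitening sketch for a batch of sentence embeddings (illustrative only; WhitenedCSE's actual method is more involved than this plain transform):

```python
import numpy as np

def whiten(embeddings, eps=1e-8):
    """PCA whitening: center the embeddings, then rotate into the
    eigenbasis of the covariance and rescale each direction, so the
    feature covariance becomes (approximately) the identity. This
    spreads embeddings more uniformly over the representation space."""
    x = embeddings - embeddings.mean(axis=0, keepdims=True)
    cov = x.T @ x / len(x)
    eigvals, eigvecs = np.linalg.eigh(cov)
    w = eigvecs / np.sqrt(eigvals + eps)  # (dim, dim) whitening matrix
    return x @ w

# toy data: 200 embeddings of dimension 16 with correlated features
rng = np.random.default_rng(1)
emb = rng.normal(size=(200, 16)) @ rng.normal(size=(16, 16))
white = whiten(emb)
```

After the transform, no single direction dominates the similarity scores, which is one motivation for whitening sentence embeddings.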

InfoCSE: Information-aggregated contrastive learning of sentence embeddings

X Wu, C Gao, Z Lin, J Han, Z Wang, S Hu - arXiv preprint arXiv …, 2022 - arxiv.org
Contrastive learning has been extensively studied in sentence embedding learning, which
assumes that the embeddings of different views of the same sentence are closer. The …

SimANS: Simple ambiguous negatives sampling for dense text retrieval

K Zhou, Y Gong, X Liu, WX Zhao, Y Shen… - arXiv preprint arXiv …, 2022 - arxiv.org
Sampling proper negatives from a large document pool is vital to effectively train a dense
retrieval model. However, existing negative sampling strategies suffer from the uninformative …
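The snippet above motivates sampling "ambiguous" negatives, i.e. candidates scored near the positive, rather than uninformative easy ones. A hedged sketch of that idea, weighting candidates by a Gaussian-shaped function of their score gap to the positive; the function name, weighting form, and hyperparameters `a` and `b` are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def sample_ambiguous_negatives(candidate_scores, pos_score, k=4, a=5.0, b=0.0, seed=0):
    """Sample k negative document indices, preferring candidates whose
    retrieval score lies close to the positive's score. Weights follow
    exp(-a * (s_i - pos_score - b)^2), downweighting both far-too-easy
    negatives and suspiciously high scorers (potential false negatives).
    `a` (peakedness) and `b` (offset) are illustrative hyperparameters."""
    scores = np.asarray(candidate_scores, dtype=float)
    weights = np.exp(-a * (scores - pos_score - b) ** 2)
    probs = weights / weights.sum()
    rng = np.random.default_rng(seed)
    return rng.choice(len(scores), size=k, replace=False, p=probs)

# toy candidate pool: 10 documents with retrieval scores; positive scored 0.8
scores = [0.1, 0.2, 0.75, 0.78, 0.3, 0.82, 0.79, 0.15, 0.5, 0.77]
idx = sample_ambiguous_negatives(scores, pos_score=0.8, k=4)
```

In a real training loop the candidate scores would come from the current retriever over a large document pool, refreshed periodically.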

CLSEP: Contrastive learning of sentence embedding with prompt

Q Wang, W Zhang, T Lei, Y Cao, D Peng… - Knowledge-Based …, 2023 - Elsevier
Sentence embedding, which aims to learn an effective representation of the sentence, is
beneficial for downstream tasks. By utilizing contrastive learning, most recent sentence …

SNCSE: Contrastive learning for unsupervised sentence embedding with soft negative samples

H Wang, Y Dou - International Conference on Intelligent Computing, 2023 - Springer
Unsupervised sentence embedding aims to obtain the most appropriate embedding for a
sentence to reflect its semantics. Contrastive learning has been attracting developing …

Improving sequential model editing with fact retrieval

X Han, R Li, H Tan, W Yuanlong, Q Chai… - Findings of the …, 2023 - aclanthology.org
The task of sequential model editing is to fix erroneous knowledge in Pre-trained Language
Models (PLMs) efficiently, precisely and continuously. Although existing methods can deal …