Dense text retrieval based on pretrained language models: A survey
Text retrieval is a long-standing research topic on information seeking, where a system is
required to return relevant information resources to users' queries in natural language. From …
Does Negative Sampling Matter? A Review with Insights into its Theory and Applications
Negative sampling has swiftly risen to prominence as a focal point of research, with wide-
ranging applications spanning machine learning, computer vision, natural language …
Improving contrastive learning of sentence embeddings from AI feedback
Contrastive learning has become a popular approach in natural language processing,
particularly for the learning of sentence embeddings. However, the discrete nature of natural …
mCLIP: Multilingual CLIP via cross-lingual transfer
Large-scale vision-language pretrained (VLP) models like CLIP have shown remarkable
performance on various downstream cross-modal tasks. However, they are usually biased …
WhitenedCSE: Whitening-based contrastive learning of sentence embeddings
This paper presents a whitening-based contrastive learning method for sentence
embedding learning (WhitenedCSE), which combines contrastive learning with a novel …
InfoCSE: Information-aggregated contrastive learning of sentence embeddings
Contrastive learning has been extensively studied in sentence embedding learning, which
assumes that the embeddings of different views of the same sentence are closer. The …
SimANS: Simple ambiguous negatives sampling for dense text retrieval
Sampling proper negatives from a large document pool is vital to effectively train a dense
retrieval model. However, existing negative sampling strategies suffer from the uninformative …
CLSEP: Contrastive learning of sentence embedding with prompt
Sentence embedding, which aims to learn an effective representation of the sentence, is
beneficial for downstream tasks. By utilizing contrastive learning, most recent sentence …
SNCSE: Contrastive learning for unsupervised sentence embedding with soft negative samples
H Wang, Y Dou - International Conference on Intelligent Computing, 2023 - Springer
Unsupervised sentence embedding aims to obtain the most appropriate embedding for a
sentence to reflect its semantics. Contrastive learning has been attracting developing …
Improving sequential model editing with fact retrieval
X Han, R Li, H Tan, W Yuanlong, Q Chai… - Findings of the …, 2023 - aclanthology.org
The task of sequential model editing is to fix erroneous knowledge in Pre-trained Language
Models (PLMs) efficiently, precisely and continuously. Although existing methods can deal …