A Survey on Self-supervised Learning: Algorithms, Applications, and Future Trends
Deep supervised learning algorithms typically require a large volume of labeled data to
achieve satisfactory performance. However, the process of collecting and labeling such data …
A survey on contrastive self-supervised learning
Self-supervised learning has gained popularity because of its ability to avoid the cost of
annotating large-scale datasets. It is capable of adopting self-defined pseudolabels as …
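Most of the contrastive methods this survey covers optimize a temperature-scaled InfoNCE (NT-Xent) objective. A minimal sketch, assuming paired embeddings of two views per example and an illustrative temperature of 0.1:

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE / NT-Xent over a batch of paired views: row i of z1 and
    row i of z2 form a positive pair; every other row is a negative."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature            # (batch, batch) cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)       # positives sit on the diagonal

z1, z2 = torch.randn(8, 128), torch.randn(8, 128) # toy embeddings
print(info_nce(z1, z2).item())
```

Larger batches supply more in-batch negatives at no extra labeling cost, which is why batch size matters so much for these methods.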
Improving graph collaborative filtering with neighborhood-enriched contrastive learning
Recently, graph collaborative filtering methods have been proposed as an effective
recommendation approach that can capture users' preferences over items by modeling the …
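As a rough illustration of the neighborhood-enriched idea (the function below is an assumption for exposition, not the paper's exact formulation): each user/item embedding is pulled toward an aggregate of its graph neighbors, with the rest of the batch as negatives.

```python
import torch
import torch.nn.functional as F

def neighbor_contrast(node_emb, neigh_emb, temperature=0.1):
    """Contrast each user/item embedding (node_emb) against an aggregate
    of its graph neighbors (neigh_emb, e.g. a mean over neighbor
    embeddings), using other rows in the batch as negatives."""
    z = F.normalize(node_emb, dim=-1)
    n = F.normalize(neigh_emb, dim=-1)
    logits = z @ n.t() / temperature
    labels = torch.arange(z.size(0), device=z.device)
    return F.cross_entropy(logits, labels)

nodes, neigh = torch.randn(32, 64), torch.randn(32, 64)
print(neighbor_contrast(nodes, neigh).item())
```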
Text and code embeddings by contrastive pre-training
Text embeddings are useful features in many applications such as semantic search and
computing text similarity. Previous work typically trains models customized for different use …
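A minimal sketch of contrastive pre-training with in-batch negatives in this spirit; the symmetric two-direction scoring and the temperature of 0.05 are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def paired_text_loss(q, d, temperature=0.05):
    """In-batch-negatives loss over (text, paired-text) embeddings:
    each row of q should match its own row of d; scoring is done in
    both directions and averaged."""
    q = F.normalize(q, dim=-1)
    d = F.normalize(d, dim=-1)
    logits = q @ d.t() / temperature
    labels = torch.arange(q.size(0), device=q.device)
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.t(), labels))

q, d = torch.randn(16, 256), torch.randn(16, 256)
print(paired_text_loss(q, d).item())
```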
SimCSE: Simple contrastive learning of sentence embeddings
This paper presents SimCSE, a simple contrastive learning framework that greatly advances
state-of-the-art sentence embeddings. We first describe an unsupervised approach, which …
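The unsupervised variant's key observation is that standard dropout alone is a strong augmentation: encoding the same batch twice under two independent dropout masks yields two views of each sentence that serve as a positive pair. A toy sketch (the encoder below is a stand-in for BERT, not the paper's code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyEncoder(nn.Module):
    """Stand-in for a BERT-style encoder; only the dropout layer matters."""
    def __init__(self, dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(),
            nn.Dropout(0.1), nn.Linear(dim, dim))

    def forward(self, x):
        return self.net(x)

def unsup_simcse_loss(encoder, x, temperature=0.05):
    """Encode the same batch twice: the two independent dropout masks
    give two views of each input, which form the positive pair."""
    encoder.train()                                   # keep dropout active
    z1 = F.normalize(encoder(x), dim=-1)
    z2 = F.normalize(encoder(x), dim=-1)
    logits = z1 @ z2.t() / temperature
    labels = torch.arange(x.size(0), device=x.device)
    return F.cross_entropy(logits, labels)

x = torch.randn(16, 128)                              # toy inputs
print(unsup_simcse_loss(ToyEncoder(), x).item())
```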
ConSERT: A contrastive framework for self-supervised sentence representation transfer
Learning high-quality sentence representations benefits a wide range of natural language
processing tasks. Though BERT-based pre-trained language models achieve high …
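ConSERT builds contrastive views by augmenting at the embedding layer, with strategies such as token shuffling, feature cutoff, and dropout. A sketch of one such augmentation (implementation details are assumptions):

```python
import torch

def token_shuffle(emb):
    """Randomly permute token positions within each sequence.
    emb: (batch, seq_len, dim) token embeddings; other embedding-layer
    augmentations (feature cutoff, dropout, adversarial noise) slot
    into the same place, before the transformer layers."""
    batch, seq_len, _ = emb.shape
    perm = torch.argsort(torch.rand(batch, seq_len), dim=1)
    return torch.gather(emb, 1, perm.unsqueeze(-1).expand_as(emb))

emb = torch.randn(4, 10, 32)
print(token_shuffle(emb).shape)   # torch.Size([4, 10, 32])
```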
DiffCSE: Difference-based contrastive learning for sentence embeddings
We propose DiffCSE, an unsupervised contrastive learning framework for learning sentence
embeddings. DiffCSE learns sentence embeddings that are sensitive to the difference …
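DiffCSE pairs a SimCSE-style contrastive loss with a conditional replaced-token-detection (RTD) term: a discriminator, conditioned on the sentence embedding, must spot which tokens a small generator edited. A sketch of the combined objective; the weighting `lam` is an illustrative assumption:

```python
import torch
import torch.nn.functional as F

def diffcse_style_loss(contrastive_loss, rtd_logits, rtd_labels, lam=0.005):
    """Combine a contrastive term with a conditional RTD term.
    rtd_logits: (batch, seq_len) discriminator scores;
    rtd_labels: (batch, seq_len), 1 where a token was replaced, else 0."""
    rtd = F.binary_cross_entropy_with_logits(rtd_logits, rtd_labels.float())
    return contrastive_loss + lam * rtd

logits = torch.randn(8, 32)
labels = torch.randint(0, 2, (8, 32))
print(diffcse_style_loss(torch.tensor(1.7), logits, labels).item())
```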
Unsupervised corpus aware language model pre-training for dense passage retrieval
Recent research demonstrates the effectiveness of using fine-tuned language models (LMs)
for dense retrieval. However, dense retrievers are hard to train, typically requiring heavily …
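The corpus-aware idea can be sketched as a contrastive term that treats spans sampled from the same document as positives and spans from other documents as negatives (the function below is an illustrative approximation, not the paper's exact loss):

```python
import torch
import torch.nn.functional as F

def same_document_span_loss(span_emb, doc_ids, temperature=1.0):
    """span_emb: (n, dim) embeddings of text spans;
    doc_ids: (n,) id of the document each span was sampled from."""
    z = F.normalize(span_emb, dim=-1)
    logits = z @ z.t() / temperature
    logits.fill_diagonal_(float('-inf'))              # a span is not its own positive
    pos = doc_ids.unsqueeze(0) == doc_ids.unsqueeze(1)
    pos.fill_diagonal_(False)
    log_prob = logits.log_softmax(dim=-1)
    # mean log-likelihood of same-document spans, per anchor span
    pos_ll = (log_prob * pos).sum(1) / pos.sum(1).clamp(min=1)
    return -pos_ll.mean()

spans = torch.randn(8, 64)
doc_ids = torch.tensor([0, 0, 1, 1, 2, 2, 3, 3])      # two spans per document
print(same_document_span_loss(spans, doc_ids).item())
```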
CLEAR: Contrastive learning for sentence representation
Pre-trained language models have proven powerful at capturing implicit
language features. However, most pre-training approaches focus on the word-level training …
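CLEAR builds positive pairs with sentence-level augmentations such as word deletion, span deletion, reordering, and substitution. A sketch of the simplest one (the deletion probability is an illustrative assumption):

```python
import random

def word_deletion(sentence, p=0.15):
    """Drop each word with probability p to produce a second view of the
    sentence for contrastive learning; span deletion, reordering, and
    substitution can be implemented analogously."""
    words = sentence.split()
    kept = [w for w in words if random.random() > p]
    return " ".join(kept) if kept else sentence

random.seed(0)
print(word_deletion("contrastive learning builds robust sentence representations"))
```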
Towards unsupervised deep graph structure learning
In recent years, graph neural networks (GNNs) have emerged as a successful tool in a
variety of graph-related applications. However, the performance of GNNs can be …
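A common recipe in this line of work, sketched below under assumptions (not necessarily this paper's exact pipeline), is to learn node features, derive an adjacency from their similarities via k-nearest-neighbor sparsification, and alternate between refining the features and rebuilding the structure:

```python
import torch
import torch.nn.functional as F

def knn_graph_from_features(node_feats, k=5):
    """Build a candidate adjacency from learned node features: cosine
    similarity, kNN sparsification, then symmetrization. Training
    alternates between refining node_feats (e.g. with a GNN and a
    contrastive objective) and rebuilding the graph."""
    z = F.normalize(node_feats, dim=-1)
    sim = z @ z.t()
    topk = sim.topk(k + 1, dim=-1)        # +1: each node is its own nearest neighbor
    adj = torch.zeros_like(sim).scatter_(1, topk.indices, topk.values.relu())
    return 0.5 * (adj + adj.t())          # symmetrize

x = torch.randn(10, 16)
print(knn_graph_from_features(x).shape)   # torch.Size([10, 10])
```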