An Efficient Self-Supervised Cross-View Training For Sentence Embedding
P Limkonchotiwat, W Ponwitayarat… - Transactions of the …, 2023 - direct.mit.edu
Self-supervised sentence representation learning is the task of constructing an embedding
space for sentences without relying on human annotation efforts. One straightforward …
Dial2vec: Self-guided contrastive learning of unsupervised dialogue embeddings
In this paper, we introduce the task of learning unsupervised dialogue embeddings. Trivial
approaches such as combining pre-trained word or sentence embeddings and encoding …
micse: Mutual information contrastive learning for low-shot sentence embeddings
This paper presents miCSE, a mutual information-based contrastive learning framework that
significantly advances the state-of-the-art in few-shot sentence embedding. The proposed …
An information minimization based contrastive learning model for unsupervised sentence embeddings learning
Unsupervised sentence embeddings learning has been recently dominated by contrastive
learning methods (e.g., SimCSE), which keep positive pairs similar and push negative pairs …
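As a concrete illustration of the contrastive objective this snippet alludes to, here is a minimal PyTorch sketch of an InfoNCE-style loss that pulls positive pairs together and pushes in-batch negatives apart. The function name, shapes, and temperature are illustrative assumptions, not taken from any of the papers above.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
    """z1, z2: (batch, dim) embeddings of two views of the same sentences."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    # Cosine similarities between every pair in the batch.
    sim = z1 @ z2.T / temperature              # (batch, batch)
    # Diagonal entries are the positive pairs; everything else is a negative.
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(sim, labels)
```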
Knowledge Graph-Based Hierarchical Text Semantic Representation
Y Wu, X Pan, J Li, S Dou, J Dong… - International journal of …, 2024 - Wiley Online Library
Document representation is the basis of language modeling. Its goal is to turn free-flowing
natural language text into a structured form that can be stored and processed by a …
Extracting Sentence Embeddings from Pretrained Transformer Models
L Stankevičius, M Lukoševičius - Applied Sciences, 2024 - mdpi.com
Pre-trained transformer models shine in many natural language processing tasks and
therefore are expected to bear the representation of the input sentence or text meaning …
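Since this survey concerns how sentence embeddings are pulled out of pretrained transformers, a short sketch of the common mean-pooling baseline may help. It uses the Hugging Face transformers API; the checkpoint name is just an illustrative choice, not one singled out by the paper.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

batch = tokenizer(["A sample sentence."], return_tensors="pt", padding=True)
with torch.no_grad():
    token_embs = model(**batch).last_hidden_state   # (1, seq_len, hidden)
# Average token embeddings, masking out padding positions.
mask = batch["attention_mask"].unsqueeze(-1)
sentence_emb = (token_embs * mask).sum(dim=1) / mask.sum(dim=1)
```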
SimCSE++: Improving contrastive learning for sentence embeddings from two perspectives
This paper improves contrastive learning for sentence embeddings from two perspectives:
handling dropout noise and addressing feature corruption. Specifically, for the first …
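The dropout noise mentioned here refers to the SimCSE trick of encoding the same batch twice with dropout active, so that the two stochastic outputs form a positive pair. A hedged sketch follows, with a toy two-layer encoder standing in for a pre-trained transformer:

```python
import torch
import torch.nn as nn

encoder = nn.Sequential(            # stand-in for a pre-trained transformer encoder
    nn.Linear(768, 768),
    nn.Dropout(p=0.1),              # dropout is the only source of "augmentation"
    nn.Linear(768, 768),
)
encoder.train()                     # keep dropout enabled for both passes

x = torch.randn(32, 768)            # a batch of pooled sentence features
z1, z2 = encoder(x), encoder(x)     # two noisy views of the same inputs
# z1 and z2 would then feed a contrastive loss such as the InfoNCE sketch above.
```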
Diversified ensemble of independent sub-networks for robust self-supervised representation learning
Ensembling neural networks is a widely recognized approach for enhancing model
performance, estimating uncertainty, and improving robustness in deep supervised learning …
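To make the ensembling idea concrete, below is an illustrative sketch (not the paper's architecture) of independent sub-network heads whose embeddings are averaged at inference; all dimensions, names, and head counts are assumptions.

```python
import torch
import torch.nn as nn

class SubNetworkEnsemble(nn.Module):
    def __init__(self, in_dim: int = 768, out_dim: int = 256, n_heads: int = 4):
        super().__init__()
        # Independent heads that share no parameters.
        self.heads = nn.ModuleList(
            nn.Sequential(nn.Linear(in_dim, in_dim), nn.ReLU(), nn.Linear(in_dim, out_dim))
            for _ in range(n_heads)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        outs = torch.stack([h(x) for h in self.heads])   # (n_heads, batch, out_dim)
        # Averaging yields the ensemble embedding; the spread across heads
        # can serve as a rough uncertainty estimate.
        return outs.mean(dim=0)
```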
Unsupervised sentence representation learning with frequency-induced adversarial tuning and incomplete sentence filtering
Pre-trained Language Models (PLMs) are nowadays the mainstay of Unsupervised
Sentence Representation Learning (USRL). However, PLMs are sensitive to the frequency …