Self-supervised contrastive graph representation with node and graph augmentation
Graph representation is a critical technology in the field of knowledge engineering and
knowledge-based applications since most knowledge bases are represented in the graph …
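The snippet above describes contrastive graph representation learning with augmentation. A minimal sketch of the common two-view recipe (node-feature masking as the augmentation, InfoNCE as the loss) might look as follows; all names here are illustrative, not taken from the cited paper.

```python
# Sketch of a graph-contrastive objective: two stochastic augmented
# views of the same node features, pulled together by InfoNCE.
import numpy as np

rng = np.random.default_rng(0)

def augment(X, drop_prob=0.3):
    """Node-feature augmentation: randomly zero out feature columns."""
    keep = rng.random(X.shape[1]) > drop_prob
    return X * keep  # broadcasts over nodes

def info_nce(Z1, Z2, tau=0.5):
    """Matching nodes across views are positives; all others negatives."""
    Z1 = Z1 / np.linalg.norm(Z1, axis=1, keepdims=True)
    Z2 = Z2 / np.linalg.norm(Z2, axis=1, keepdims=True)
    sim = Z1 @ Z2.T / tau                  # pairwise similarities
    sim -= sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))     # -log p(positive pair)

X = rng.normal(size=(8, 16))       # 8 nodes, 16 features
Z1, Z2 = augment(X), augment(X)    # two augmented views
loss = info_nce(Z1, Z2)
print(f"contrastive loss: {loss:.3f}")
```

In practice the two views would be passed through a shared GNN encoder before the loss; the sketch applies the loss directly to the augmented features to keep it self-contained.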
Rare: Robust masked graph autoencoder
Masked graph autoencoder (MGAE) has emerged as a promising self-supervised graph pre-
training (SGP) paradigm due to its simplicity and effectiveness. However, existing efforts …
Contextual features online prediction for self-supervised graph representation
Self-supervised graph representation learning (SSGRL) is an emerging technique for
machine learning-based expert applications. SSGRL can effectively encode unlabeled data …
Rethinking Graph Masked Autoencoders through Alignment and Uniformity
Self-supervised learning on graphs can be bifurcated into contrastive and generative
methods. Contrastive methods, also known as graph contrastive learning (GCL), have …
A review of graph neural networks and pretrained language models for knowledge graph reasoning
J Ma, B Liu, K Li, C Li, F Zhang, X Luo, Y Qiao - Neurocomputing, 2024 - Elsevier
Knowledge Graph (KG) stores human knowledge facts in an intuitive graphical
structure but faces challenges such as incomplete construction or inability to handle new …
Enhancing graph neural networks with structure-based prompt
Graph Neural Networks (GNNs) are powerful in learning semantics of graph data. Recently,
a new paradigm "pre-train, prompt" has shown promising results in adapting GNNs to …
Prompt tuning for multi-view graph contrastive learning
In recent years, "pre-training and fine-tuning" has emerged as a promising approach in
addressing the issues of label dependency and poor generalization performance in …
Masked Graph Autoencoder with Non-discrete Bandwidths
Masked graph autoencoders have emerged as a powerful graph self-supervised learning
method that has yet to be fully explored. In this paper, we unveil that the existing discrete …
Generative and contrastive paradigms are complementary for graph self-supervised learning
For graph self-supervised learning (GSSL), masked autoencoder (MAE) follows the
generative paradigm and learns to reconstruct masked graph edges or node features while …
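The generative (masked-autoencoder) objective described above can be sketched roughly: hide some node features, propagate over the graph, and reconstruct the hidden part. The following is a toy illustration under common assumptions (row-normalised propagation, one encoder layer, loss on masked nodes only); none of the names come from the cited paper.

```python
# Toy masked graph autoencoder: reconstruct masked node features.
import numpy as np

rng = np.random.default_rng(1)

n, d = 6, 4
X = rng.normal(size=(n, d))               # node features
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.maximum(A, A.T)                    # symmetric adjacency
np.fill_diagonal(A, 1.0)                  # self-loops
A_hat = A / A.sum(axis=1, keepdims=True)  # row-normalised propagation

mask = np.array([True, True, True, False, False, False])
X_in = X.copy()
X_in[mask] = 0.0                          # hide features of masked nodes

W_enc = rng.normal(size=(d, 8)) * 0.1
W_dec = rng.normal(size=(8, d)) * 0.1
H = np.tanh(A_hat @ X_in @ W_enc)         # one GCN-style encoder layer
X_rec = A_hat @ H @ W_dec                 # linear decoder

# Reconstruction loss on the masked nodes only, per the MAE recipe.
loss = np.mean((X_rec[mask] - X[mask]) ** 2)
print(f"reconstruction loss: {loss:.3f}")
```

A trained model would update `W_enc`/`W_dec` by gradient descent on this loss; the sketch only shows the forward pass and objective.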
PSP: Pre-training and Structure Prompt Tuning for Graph Neural Networks
Graph Neural Networks (GNNs) are powerful in learning semantics of graph data.
Recently, a new paradigm “pre-train & prompt” has shown promising results in adapting …