Self-supervised contrastive graph representation with node and graph augmentation

H Duan, C Xie, B Li, P Tang - Neural Networks, 2023 - Elsevier
Graph representation is a critical technology in the field of knowledge engineering and
knowledge-based applications since most knowledge bases are represented in the graph …

RARE: Robust Masked Graph Autoencoder

W Tu, Q Liao, S Zhou, X Peng, C Ma… - … on Knowledge and …, 2023 - ieeexplore.ieee.org
Masked graph autoencoder (MGAE) has emerged as a promising self-supervised graph pre-
training (SGP) paradigm due to its simplicity and effectiveness. However, existing efforts …

Contextual features online prediction for self-supervised graph representation

H Duan, C Xie, P Tang, B Yu - Expert Systems with Applications, 2024 - Elsevier
Self-supervised graph representation learning (SSGRL) is an emerging technique for
machine learning-based expert applications. SSGRL can effectively encode unlabeled data …

Rethinking Graph Masked Autoencoders through Alignment and Uniformity

L Wang, X Tao, Q Liu, S Wu - Proceedings of the AAAI Conference on …, 2024 - ojs.aaai.org
Self-supervised learning on graphs can be bifurcated into contrastive and generative
methods. Contrastive methods, also known as graph contrastive learning (GCL), have …

A review of graph neural networks and pretrained language models for knowledge graph reasoning

J Ma, B Liu, K Li, C Li, F Zhang, X Luo, Y Qiao - Neurocomputing, 2024 - Elsevier
A Knowledge Graph (KG) stores human knowledge facts in an intuitive graphical
structure but faces challenges such as incomplete construction or inability to handle new …

Enhancing graph neural networks with structure-based prompt

Q Ge, Z Zhao, Y Liu, A Cheng, X Li, S Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
Graph Neural Networks (GNNs) are powerful in learning semantics of graph data. Recently,
a new paradigm" pre-train, prompt" has shown promising results in adapting GNNs to …

Prompt tuning for multi-view graph contrastive learning

C Gong, X Li, J Yu, C Yao, J Tan, C Yu… - arXiv preprint arXiv …, 2023 - arxiv.org
In recent years," pre-training and fine-tuning" has emerged as a promising approach in
addressing the issues of label dependency and poor generalization performance in …

Masked Graph Autoencoder with Non-discrete Bandwidths

Z Zhao, Y Li, Y Zou, J Tang, R Li - Proceedings of the ACM on Web …, 2024 - dl.acm.org
Masked graph autoencoders have emerged as a powerful graph self-supervised learning
method that has yet to be fully explored. In this paper, we unveil that the existing discrete …

Generative and contrastive paradigms are complementary for graph self-supervised learning

Y Wang, X Yan, C Hu, Q Xu, C Yang… - 2024 IEEE 40th …, 2024 - ieeexplore.ieee.org
For graph self-supervised learning (GSSL), masked autoencoder (MAE) follows the
generative paradigm and learns to reconstruct masked graph edges or node features while …

PSP: Pre-training and Structure Prompt Tuning for Graph Neural Networks

Q Ge, Z Zhao, Y Liu, A Cheng, X Li, S Wang… - … European Conference on …, 2024 - Springer
Graph Neural Networks (GNNs) are powerful in learning semantics of graph data.
Recently, a new paradigm “pre-train & prompt” has shown promising results in adapting …