Contrastive learning for fair graph representations via counterfactual graph augmentation

C Li, D Cheng, G Zhang, S Zhang - Knowledge-Based Systems, 2024 - Elsevier
Graph neural networks (GNNs) have exhibited excellent performance in graph
representation learning. However, GNNs might inherit biases from the data, leading to …

Masked Motion Prediction with Semantic Contrast for Point Cloud Sequence Learning

Y Han, C Xu, R Xu, J Qian, J Xie - European Conference on Computer …, 2025 - Springer
Self-supervised representation learning on point cloud sequences is a challenging task due
to the complex spatio-temporal structure. Most recent attempts aim to train the point cloud …

Where to Mask: Structure-Guided Masking for Graph Masked Autoencoders

C Liu, Y Wang, Y Zhan, X Ma, D Tao, J Wu… - arXiv preprint arXiv …, 2024 - arxiv.org
Graph masked autoencoders (GMAE) have emerged as a significant advancement in self-
supervised pre-training for graph-structured data. Previous GMAE models primarily utilize a …

A Survey on Self-Supervised Pre-Training of Graph Foundation Models: A Knowledge-Based Perspective

Z Zhao, Y Li, Y Zou, R Li, R Zhang - arXiv preprint arXiv:2403.16137, 2024 - arxiv.org
Graph self-supervised learning is now a go-to method for pre-training graph foundation
models, including graph neural networks, graph transformers, and more recent large …

A deep contrastive framework for unsupervised temporal link prediction in dynamic networks

P Jiao, X Zhang, Z Liu, L Zhang, H Wu, M Gao, T Li… - Information …, 2024 - Elsevier
In dynamic networks, temporal link prediction aims to predict the appearance and
disappearance of links in future snapshots based on the network structure we have …

CYCLE: Cross-Year Contrastive Learning in Entity-Linking

P Zhang, C Cao, K Zaporojets, P Groth - Proceedings of the 33rd ACM …, 2024 - dl.acm.org
Knowledge graphs constantly evolve with new entities emerging, existing definitions being
revised, and entity relationships changing. These changes lead to temporal degradation in …

Do spectral cues matter in contrast-based graph self-supervised learning?

X Jian, X Zhao, W Pang, C Ying, Y Wang, Y Xu… - arXiv preprint arXiv …, 2024 - arxiv.org
The recent surge in contrast-based graph self-supervised learning has prominently featured
an intensified exploration of spectral cues. However, an intriguing paradox emerges, as …

[PDF] Towards Advanced Unsupervised Representation Learning for Graph-Structured Data

Q Sun - 2023 - unsworks.unsw.edu.au
Despite the unprecedented achievements of deep learning on graphs, its success
is predominantly tethered to the quality and quantity of labeled datasets. In response to this …

Are spectral augmentations necessary in contrast-based graph self-supervised learning?

X Jian, X Zhao, W Pang, C Ying, Y Wang, Y Xu, T Yu - openreview.net
The recent surge in contrast-based graph self-supervised learning has prominently featured
an intensified exploration of spectral cues. Spectral augmentation, which involves modifying …