Contrastive learning for fair graph representations via counterfactual graph augmentation
Graph neural networks (GNNs) have exhibited excellent performance in graph
representation learning. However, GNNs might inherit biases from the data, leading to …
Masked Motion Prediction with Semantic Contrast for Point Cloud Sequence Learning
Self-supervised representation learning on point cloud sequences is a challenging task due
to the complex spatio-temporal structure. Most recent attempts aim to train the point cloud …
Where to Mask: Structure-Guided Masking for Graph Masked Autoencoders
Graph masked autoencoders (GMAE) have emerged as a significant advancement in self-
supervised pre-training for graph-structured data. Previous GMAE models primarily utilize a …
A Survey on Self-Supervised Pre-Training of Graph Foundation Models: A Knowledge-Based Perspective
Graph self-supervised learning is now a go-to method for pre-training graph foundation
models, including graph neural networks, graph transformers, and more recent large …
A deep contrastive framework for unsupervised temporal link prediction in dynamic networks
In dynamic networks, temporal link prediction aims to predict the appearance and
disappearance of links in future snapshots based on the network structure we have …
CYCLE: Cross-Year Contrastive Learning in Entity-Linking
Knowledge graphs constantly evolve with new entities emerging, existing definitions being
revised, and entity relationships changing. These changes lead to temporal degradation in …
Do spectral cues matter in contrast-based graph self-supervised learning?
The recent surge in contrast-based graph self-supervised learning has prominently featured
an intensified exploration of spectral cues. However, an intriguing paradox emerges, as …
Towards Advanced Unsupervised Representation Learning for Graph-Structured Data
Q Sun - 2023 - unsworks.unsw.edu.au
Despite the unprecedented achievements of deep learning on graphs, its success
is predominantly tethered to the quality and quantity of labeled datasets. In response to this …
Are spectral augmentations necessary in contrast-based graph self-supervised learning?
The recent surge in contrast-based graph self-supervised learning has prominently featured
an intensified exploration of spectral cues. Spectral augmentation, which involves modifying …