Distributed graph neural network training: A survey
Graph neural networks (GNNs) are a class of deep learning models that are trained on
graphs and have been successfully applied in various domains. Despite the effectiveness of …
SANCUS: staleness-aware communication-avoiding full-graph decentralized training in large-scale graph neural networks
Graph neural networks (GNNs) have emerged due to their success at modeling graph data.
Yet, it is challenging for GNNs to efficiently scale to large graphs. Thus, distributed GNNs …
Parallel and distributed graph neural networks: An in-depth concurrency analysis
Graph neural networks (GNNs) are among the most powerful tools in deep learning. They
routinely solve complex problems on unstructured networks, such as node classification …
NeutronStar: distributed GNN training with hybrid dependency management
GNN training needs to resolve the issue of vertex dependencies, i.e., each vertex
representation's update depends on its neighbors. Existing distributed GNN systems adopt …
MariusGNN: Resource-efficient out-of-core training of graph neural networks
We study training of Graph Neural Networks (GNNs) for large-scale graphs. We revisit the
premise of using distributed training for billion-scale graphs and show that for graphs that fit …
DUCATI: A dual-cache training system for graph neural networks on giant graphs with the GPU
Recently Graph Neural Networks (GNNs) have achieved great success in many
applications. The mini-batch training has become the de-facto way to train GNNs on giant …
WholeGraph: A fast graph neural network training framework with multi-GPU distributed shared memory architecture
D Yang, J Liu, J Qi, J Lai - SC22: International Conference for …, 2022 - ieeexplore.ieee.org
Graph neural networks (GNNs) are prevalent to deal with graph-structured datasets,
encoding graph data into low dimensional vectors. In this paper, we present a fast training …
RelHD: A graph-based learning on FeFET with hyperdimensional computing
Advances in graph neural network (GNN)-based algorithms enable machine learning on
relational data. GNNs are computationally demanding since they rely upon backpropagation …
Hyperscale FPGA-as-a-service architecture for large-scale distributed graph neural network
Graph neural networks (GNNs) are a promising emerging application for link prediction,
recommendation, etc. Existing hardware innovation is limited to single-machine GNN (SM …
Fast and efficient model serving using multi-GPUs with direct-host-access
As deep learning (DL) inference has been widely adopted for building user-facing
applications in many domains, it is increasingly important for DL inference servers to …