Publication | Open Access
Sancus
66 Citations · 19 References · Year: 2022
Keywords: Cluster Computing, Graph Neural Networks, Engineering, Graph Theory, Data Science, Distributed Algorithms, Edge Computing, Historical Embeddings, Graph Neural Network, Federated Learning, Large-scale Network, Computer Engineering, Network Analysis, Computer Science, Graph Analysis, Deep Learning, Graph Processing, Graph Data
Graph neural networks (GNNs) have emerged due to their success at modeling graph data, yet it is challenging for GNNs to scale efficiently to large graphs, so distributed GNN systems come into play. To avoid the communication caused by expensive data movement between workers, we propose Sancus, a staleness-aware communication-avoiding decentralized GNN system. By introducing a set of novel bounded embedding staleness metrics and adaptively skipping broadcasts, Sancus abstracts decentralized GNN processing as sequential matrix multiplications and reuses historical embeddings via a cache. Theoretically, we show bounded approximation errors of embeddings and gradients with a convergence guarantee. Empirically, we evaluate Sancus with common GNN models under different system setups on large-scale benchmark datasets. Compared to state-of-the-art works, Sancus avoids up to 74% of communication and achieves at least 1.86× faster throughput on average without accuracy loss.
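The mechanism described in the abstract (each worker keeps cached "historical" embeddings of its peers and re-broadcasts fresh embeddings only when a staleness bound is exceeded) can be sketched in a few lines. The snippet below is a minimal illustration in Python/NumPy, not the paper's implementation; the class name, the `epsilon` threshold, and the norm-based drift metric are assumptions made for exposition.

```python
import numpy as np

class StalenessAwareCache:
    """Illustrative sketch of staleness-aware broadcast skipping.

    Each worker holds a cached copy of the embeddings last broadcast by
    every partition. Before a propagation step, a worker re-broadcasts its
    fresh embeddings only if they have drifted from the cached copy by more
    than `epsilon`; otherwise peers keep using the historical version.
    """

    def __init__(self, num_workers, epsilon=0.1):
        self.epsilon = epsilon
        # cache[w] holds the embeddings last broadcast by worker w.
        self.cache = [None] * num_workers

    def maybe_broadcast(self, worker_id, fresh_embeddings):
        """Return (embeddings_to_use, did_broadcast)."""
        cached = self.cache[worker_id]
        if cached is None:
            # First round: must broadcast so peers have something to cache.
            self.cache[worker_id] = fresh_embeddings.copy()
            return fresh_embeddings, True
        # A simple relative-drift staleness metric (illustrative only).
        drift = np.linalg.norm(fresh_embeddings - cached) / (np.linalg.norm(cached) + 1e-12)
        if drift > self.epsilon:
            self.cache[worker_id] = fresh_embeddings.copy()
            return fresh_embeddings, True
        # Drift stays within the bound: skip the broadcast, reuse history.
        return cached, False


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cache = StalenessAwareCache(num_workers=2, epsilon=0.05)
    emb = rng.standard_normal((4, 8))
    for step in range(3):
        emb = emb + 0.01 * rng.standard_normal(emb.shape)  # small per-step updates
        used, sent = cache.maybe_broadcast(worker_id=0, fresh_embeddings=emb)
        print(f"step {step}: broadcast={sent}")
```

In an actual decentralized deployment, the broadcast would be a collective operation across workers holding graph partitions, and the skipped broadcasts are what produce the communication savings reported in the abstract; the bounded staleness metrics are what keep the resulting embedding and gradient approximation errors controlled.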