Graph sampling methods generally select a subset of nodes from one node's neighborhood, or from one layer of the graph, to obtain subgraphs for subsequent training, which makes the computation of model training efficient (see the sketch after the following list).
Node-wise sampling methods
Layer-wise sampling methods
Subgraph-based sampling methods
Heterogeneous sampling methods
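Below is a minimal Python sketch of node-wise neighbor sampling (in the style popularized by GraphSAGE); the adjacency-list format, the function name `sample_neighbors`, and the `fanouts` parameter are illustrative assumptions, not an API from any of the surveyed systems.

```python
import random

def sample_neighbors(adj_list, batch_nodes, fanouts):
    """Node-wise neighbor sampling sketch: for each layer, keep at most
    `fanout` randomly chosen neighbors per node, so the subgraph used
    for training stays small and roughly fixed-size."""
    layers = [list(batch_nodes)]          # nodes needed at each hop
    sampled_edges = []
    frontier = set(batch_nodes)
    for fanout in fanouts:                # one fanout per GNN layer
        next_frontier = set()
        for u in frontier:
            neigh = adj_list.get(u, [])
            picked = random.sample(neigh, min(fanout, len(neigh)))
            sampled_edges.extend((u, v) for v in picked)
            next_frontier.update(picked)
        layers.append(list(next_frontier))
        frontier = next_frontier
    return layers, sampled_edges

# usage: toy adjacency list, 2-layer sampling with fanouts (2, 2)
adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 3], 3: [0, 2]}
layers, edges = sample_neighbors(adj, batch_nodes=[0], fanouts=[2, 2])
print(layers, edges)
```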
Graph sparsification methods typically remove task-irrelevant edges from a graph by designing a specific optimization objective (see the sketch after this list).
Heuristic sparsification
Learnable module for sparsification
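A minimal sketch of heuristic sparsification, assuming each edge already carries a precomputed importance score; the top `keep_ratio` fraction is kept and the rest discarded as task-irrelevant. The scoring scheme and the function name `sparsify_by_weight` are hypothetical.

```python
import numpy as np

def sparsify_by_weight(edges, weights, keep_ratio=0.5):
    """Heuristic sparsification sketch: rank edges by a precomputed
    importance score (here, an edge weight) and keep only the top
    fraction; lower-ranked edges are treated as task-irrelevant."""
    order = np.argsort(weights)[::-1]            # descending importance
    k = max(1, int(keep_ratio * len(edges)))
    return [edges[i] for i in order[:k]]

edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
weights = np.array([0.9, 0.1, 0.6, 0.3])
print(sparsify_by_weight(edges, weights, keep_ratio=0.5))  # keeps 2 edges
```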
Graph partitioning is an NP-hard problem whose goal is to reduce the original graph to smaller subgraphs constructed from mutually exclusive node groups (see the sketch after this list).
Graph partition for generic GNN training
Graph partition for sampling-based GNN training
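A toy sketch of the partitioning interface: nodes are assigned round-robin into k mutually exclusive groups and the resulting edge cut is reported. Real systems minimize the edge cut with heuristics such as METIS; the trivial assignment here only illustrates the inputs and outputs, and the function name is hypothetical.

```python
def partition_nodes(num_nodes, edges, k):
    """Toy partition sketch: k mutually exclusive node groups plus the
    edge-cut count (edges crossing group boundaries)."""
    part = {v: v % k for v in range(num_nodes)}  # round-robin assignment
    groups = [[v for v in range(num_nodes) if part[v] == p] for p in range(k)]
    cut = sum(1 for u, v in edges if part[u] != part[v])
    return groups, cut

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
groups, cut = partition_nodes(num_nodes=4, edges=edges, k=2)
print(groups, cut)
```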
The forward propagation in the $l$-th layer in GCN training can be formulated as:

$$\textbf{H}^l=\sigma(\tilde{\textbf{D}}^{-1/2}\tilde{\textbf{A}}\tilde{\textbf{D}}^{-1/2}\textbf{H}^{l-1}\textbf{W}^{l-1}),$$

in which linear aggregation of neighboring information and nonlinear activation are combined to update node representations (a sketch of this layer follows). However, recent literature has argued that the efficiency of GNNs can be further improved by simplifying redundant operations.
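A minimal NumPy sketch of the layer above, assuming a dense adjacency matrix and ReLU as the activation $\sigma$; variable names are illustrative.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H^l = sigma(D~^{-1/2} A~ D~^{-1/2} H^{l-1} W^{l-1}),
    where A~ = A + I is the adjacency with self-loops and D~ its degree
    matrix; ReLU stands in for sigma."""
    A_tilde = A + np.eye(A.shape[0])
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(d ** -0.5)
    S = D_inv_sqrt @ A_tilde @ D_inv_sqrt      # normalized propagation matrix
    return np.maximum(S @ H @ W, 0.0)          # linear aggregation + nonlinearity

# toy graph: 3 nodes, 2 input features, 2 hidden units
A = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
H0 = np.random.randn(3, 2)
W0 = np.random.randn(2, 2)
print(gcn_layer(A, H0, W0).shape)  # (3, 2)
```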
Generic simplified GNN
Task-specific simplified GNN
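As one concrete instance of a generic simplified GNN, the sketch below follows the spirit of SGC (Simple Graph Convolution), which is not named in the text above: dropping the per-layer nonlinearity lets k propagation steps collapse into a single precomputed matrix $S^k X$, after which only a linear classifier needs training. Function names are illustrative.

```python
import numpy as np

def normalize_adj(A):
    """Symmetric normalization D~^{-1/2} A~ D~^{-1/2} with self-loops."""
    A_tilde = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(A_tilde.sum(axis=1) ** -0.5)
    return d_inv_sqrt @ A_tilde @ d_inv_sqrt

def sgc_features(A, X, k=2):
    """Simplified-GNN sketch: with no sigma between layers, k rounds of
    aggregation reduce to S^k X, which can be precomputed once."""
    S = normalize_adj(A)
    for _ in range(k):
        X = S @ X
    return X

A = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
X = np.random.randn(3, 4)
print(sgc_features(A, X, k=2).shape)  # (3, 4)
```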
Model compression is a technique that compresses a complicated model, such as a DNN, into a lightweight one that typically preserves fewer parameters (a distillation sketch follows the list below).
Model quantization
Knowledge distillation
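A minimal sketch of the standard knowledge-distillation objective: the lightweight student is trained to match the teacher's temperature-softened output distribution via a KL divergence. The logits and temperature value here are illustrative.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax; higher T flattens the distribution."""
    e = np.exp(z / T - np.max(z / T, axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(p_teacher || p_student) on softened outputs, scaled by T^2 as
    in the standard distillation formulation."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=1)
    return float(np.mean(kl) * T * T)

teacher = np.array([[2.0, 0.5, -1.0]])
student = np.array([[1.0, 0.2, -0.5]])
print(distillation_loss(student, teacher))
```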