GNN
Reason:
GIN
This is one of my favourite papers.
ref: How Powerful are Graph Neural Networks? (ICLR 2019)
Lemma 2. Let G1 and G2 be any two non-isomorphic graphs. If a graph neural network A : G → R^d maps G1 and G2 to different embeddings, the Weisfeiler-Lehman graph isomorphism test also decides G1 and G2 are not isomorphic.
Lemma 2 means that the discriminative power of any GNN is upper-bounded by the 1-dimensional WL test.
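To make the bound concrete, here is a minimal sketch of the 1-dimensional WL test (colour refinement); the function names and the nested-tuple encoding of colours are my own choices, not from the paper:

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """1-dim Weisfeiler-Lehman colour refinement.
    adj: adjacency list {node: [neighbours]}.
    Each round, a node's new colour is (own colour, sorted multiset of
    neighbour colours); nested tuples serve as hash-free colour labels."""
    colors = {v: () for v in adj}  # uniform initial colour
    for _ in range(rounds):
        colors = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                  for v in adj}
    return Counter(colors.values())  # multiset of final colours

def wl_test(adj1, adj2, rounds=3):
    """False = provably non-isomorphic; True = WL cannot tell them apart."""
    return wl_colors(adj1, rounds) == wl_colors(adj2, rounds)
```

Note the test can return True for non-isomorphic graphs (e.g. any two 2-regular graphs of the same size, such as a 6-cycle vs. two disjoint triangles), which is exactly why it is only an upper bound on GNN power.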
Theorem 3. Let A : G → R^d be a GNN. With a sufficient number of GNN layers, A maps any graphs G1 and G2 that the Weisfeiler-Lehman test of isomorphism decides as non-isomorphic to different embeddings, provided that all operations in the GNN (aggregate, combine, and readout) are injective.
Based on Theorem 3, we can propose a maximally simple GNN model, GIN, where aggregate = sum and combine = (1 + ε) · h_v^{(k-1)} + Σ_{u∈N(v)} h_u^{(k-1)}, followed by an MLP: h_v^{(k)} = MLP^{(k)}((1 + ε^{(k)}) · h_v^{(k-1)} + Σ_{u∈N(v)} h_u^{(k-1)}).
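A minimal numpy sketch of one GIN layer under this update rule; the 2-layer ReLU MLP and the dense-adjacency representation are my own simplifications, not the paper's implementation:

```python
import numpy as np

def gin_layer(H, A, W1, W2, eps=0.0):
    """One GIN layer: h_v <- MLP((1 + eps) * h_v + sum_{u in N(v)} h_u).
    H: (n, d) node features; A: (n, n) 0/1 adjacency matrix;
    W1, W2: weights of a 2-layer MLP (an assumed minimal choice)."""
    agg = (1.0 + eps) * H + A @ H       # sum aggregation + (1+eps) combine
    hidden = np.maximum(agg @ W1, 0.0)  # ReLU
    return hidden @ W2

def readout(H):
    """Injective-friendly sum readout: graph embedding from node embeddings."""
    return H.sum(axis=0)
```

The sum (rather than mean or max) is what keeps aggregation injective over multisets of node features, which is the whole point of Theorem 3.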
GMN
Graph matching
(input: G1, G2; output: a similarity score)
The only difference from a GNN: when combining information for node i, we consider one extra term, a cross-graph matching vector μ_{j→i}, where j ranges over the nodes of the other graph and the weights come from attention between h_i and each h_j.
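The cross-graph term can be sketched as follows, assuming dot-product attention for the similarity (the GMN paper allows other similarity functions; the function name is mine):

```python
import numpy as np

def cross_graph_matching(H1, H2):
    """Cross-graph attention term (sketch).
    For each node i of graph 1, attend over all nodes j of graph 2:
      a_ij = softmax_j(h_i . h_j),  mu_i = sum_j a_ij * (h_i - h_j).
    Since the a_ij sum to 1 over j, mu_i = h_i - sum_j a_ij * h_j.
    H1: (n1, d), H2: (n2, d); returns (n1, d) matching vectors."""
    scores = H1 @ H2.T                           # (n1, n2) similarity logits
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    a = np.exp(scores)
    a /= a.sum(axis=1, keepdims=True)            # attention over graph-2 nodes
    return H1 - a @ H2                           # sum_j a_ij (h_i - h_j)
```

Intuitively, μ_i measures how far node i is from its best soft match in the other graph, so well-matched node pairs contribute little and mismatches are amplified in the similarity score.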