
CS224W (Machine Learning with Graphs) study notes: node embedding

The encoder maps nodes to embeddings.

The decoder maps a pair of embeddings to a similarity score.

For our own task, we need to think about how the embeddings will be used. (In the previous work, at least as covered in the lecture, the embeddings are used for similarity checks.)
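As a minimal sketch of this setup (names, shapes, and the toy numbers are assumptions for illustration, not the lecture's code): the shallow encoder is just an embedding-matrix lookup, and the decoder is a dot product.

```python
import numpy as np

num_nodes, dim = 100, 16
rng = np.random.default_rng(0)

# Shallow encoder: an embedding matrix Z, one learnable d-dim vector per node.
Z = rng.normal(scale=0.1, size=(num_nodes, dim))

def encode(u):
    """Encoder ENC(u): look up the embedding z_u of node u."""
    return Z[u]

def decode(z_u, z_v):
    """Decoder: similarity score via the dot product z_u^T z_v."""
    return float(z_u @ z_v)

score = decode(encode(3), encode(7))  # higher score = the pair is more "similar"
```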

Random walk

$z_u^\top z_v \approx$ probability that $u$ and $v$ co-occur on a random walk over the graph

Therefore, we maximize the predicted probability of node pairs that co-occur on random walks.
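A toy sketch of this objective (the graph, window size, and negative-sampling details here are my own assumptions, not the lecture's exact formulation): generate uniform random walks, then push up $\sigma(z_u^\top z_v)$ for pairs that co-occur within a window and push it down for randomly sampled negative pairs.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical toy graph as an adjacency list (a 4-cycle)
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
Z = rng.normal(scale=0.1, size=(len(adj), 16))  # node embeddings

def random_walk(start, length):
    """Uniform random walk: step to a uniformly chosen neighbor each time."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(int(rng.choice(adj[walk[-1]])))
    return walk

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgd_step(walk, window=2, num_neg=2, lr=0.05):
    """One pass over a walk: raise sigma(z_u^T z_v) for co-occurring pairs,
    lower it for randomly sampled negative pairs."""
    for i, u in enumerate(walk):
        for v in walk[max(0, i - window): i + window + 1]:
            if v == u:
                continue
            zu, zv = Z[u].copy(), Z[v].copy()
            g = 1.0 - sigmoid(zu @ zv)        # d/dx log sigma(z_u^T z_v)
            Z[u] += lr * g * zv
            Z[v] += lr * g * zu
            for w in rng.integers(0, len(adj), size=num_neg):
                zw = Z[w].copy()
                g = -sigmoid(zu @ zw)         # d/dx log(1 - sigma(z_u^T z_w))
                Z[u] += lr * g * zw
                Z[w] += lr * g * zu

for node in adj:
    for _ in range(10):
        sgd_step(random_walk(node, length=8))
```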

node2vec

biased walk: interpolate between BFS-like exploration (local, microscopic view) and DFS-like exploration (global, macroscopic view), controlled by the return parameter $p$ and the in-out parameter $q$
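A sketch of the biased second-order sampling rule (the toy graph is hypothetical; $p$ and $q$ are node2vec's return and in-out parameters): given the previous node $t$ and the current node $v$, the next node is drawn with unnormalized weight $1/p$ for returning to $t$, $1$ for neighbors of $t$ (BFS-like, stays close), and $1/q$ for nodes farther from $t$ (DFS-like, moves outward).

```python
import numpy as np

rng = np.random.default_rng(0)

def node2vec_walk(adj, start, length, p=1.0, q=1.0):
    """Biased second-order walk: weight candidates by how far they are
    from the previously visited node t."""
    walk = [start, int(rng.choice(adj[start]))]
    for _ in range(length - 2):
        t, v = walk[-2], walk[-1]
        nbrs = adj[v]
        w = np.array([
            1.0 / p if x == t else (1.0 if x in adj[t] else 1.0 / q)
            for x in nbrs
        ])
        walk.append(int(rng.choice(nbrs, p=w / w.sum())))
    return walk

# small q -> DFS-flavored walks that wander outward;
# small p -> walks that often backtrack and stay local (BFS-flavored)
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(node2vec_walk(adj, start=0, length=10, p=1.0, q=0.5))
```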

Graph Embedding

  • average/sum node embeddings (see the sketch after this list)
  • introduce a virtual node that represents the entire graph and use its embedding
  • hierarchically cluster nodes in the graph and aggregate embeddings cluster by cluster
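A minimal sketch of the first option (average/sum pooling), assuming node embeddings $Z$ have already been trained as above; the function name and toy data are my own.

```python
import numpy as np

def graph_embedding(Z, nodes, how="mean"):
    """Pool the node embeddings Z[nodes] into one graph-level vector."""
    vecs = Z[list(nodes)]
    return vecs.mean(axis=0) if how == "mean" else vecs.sum(axis=0)

Z = np.random.default_rng(0).normal(size=(5, 16))  # toy node embeddings
z_G = graph_embedding(Z, nodes=[0, 1, 2, 3, 4], how="sum")
```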