
Approximation Ratios of Graph Neural Networks for Combinatorial Problems

knshnb
October 03, 2019

Transcript

  1. TL;DR
     ✴ What’s this paper?
       ‣ Establishes a stronger GNN model using knowledge from distributed local algorithms
       ‣ Analyzes GNNs’ approximation ratios for combinatorial problems
       ‣ Proves that preprocessing strengthens representation power
     ✴ Why this paper?
       ‣ GNNs intuitively correspond to distributed local algorithms
       ‣ Theoretically interesting result on preprocessing
  2. Related Work
     There are few theoretical analyses of GNNs.
     ✴ [Xu+ ICLR 2019]
       ‣ Representation power in terms of graph isomorphism
       ‣ Compared with the WL isomorphism test (a heuristic algorithm)
     ✴ This paper
       ‣ Representation power for solving other combinatorial problems
       ‣ Compared with distributed local algorithms
  3. Distributed Local Algorithm
     ✴ Each node of the graph has a processor with infinite computational resources
     ✴ In each step, every processor synchronously:
       1. Sends messages to its neighbors
       2. Receives messages from its neighbors
       3. Updates its state
     ✴ Decides its output within a constant number of steps
     ✴ The graph degree is usually assumed to be bounded by Δ
  4. Developed Models
     ✴ Deterministic algorithms over identical processors are known to be weak
     ✴ Many models have been developed
       ‣ Port numbering, unique identifiers, randomness, etc.
  5. Port Numbering
     ✴ Number each port of a node v as 1, 2, ⋯, deg(v)
       ‣ p_tail(v, i): the node sending a message to port i of node v
       ‣ p_n(v, i): the port of p_tail(v, i) connected to port i of node v
     ✴ Consistent port numbering: ∀(v, i). p(p(v, i)) = (v, i)
       ‣ Can be computed in linear time
     (Figure: inconsistent vs. consistent port numbering, from https://arxiv.org/pdf/1205.2051.pdf)
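The consistency condition above can be satisfied with a single pass over the edge list: pair up the next free port at each endpoint of every edge. A minimal Python sketch (the function name and edge-list representation are my own, not from the paper):

```python
def consistent_port_numbering(edges):
    """Assign ports so that p(p(v, i)) == (v, i) for every (v, i).

    edges: list of undirected edges (u, v), u != v.
    Returns p: dict mapping (v, i) -> (u, j), the port at the other end.
    """
    next_port = {}          # next used port number per node (1-indexed)
    p = {}
    for u, v in edges:      # one pass over the edges -> linear time
        i = next_port.get(u, 0) + 1
        j = next_port.get(v, 0) + 1
        next_port[u], next_port[v] = i, j
        p[(u, i)] = (v, j)  # port i of u connects to port j of v
        p[(v, j)] = (u, i)  # ... and vice versa, giving consistency
    return p
```

Because both directions of each edge are recorded together, applying p twice always returns to the starting port, which is exactly the consistency condition.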
  6. Model Classes [Hella+ 2015]
     ✴ Set: set(a⃗) = set(b⃗) ⇒ f(a⃗) = f(b⃗)
     ✴ Multiset: multiset(a⃗) = multiset(b⃗) ⇒ f(a⃗) = f(b⃗)
     ✴ Broadcast: send the same message to all neighbors

              Incoming Messages   Outgoing Messages
      SB(1)   Set                 Broadcast
      MB(1)   Multiset            Broadcast
      VVc(1)  Vector              Vector
  7. SB-GNN
     ✴ z_v^{l+1} = f^l(z_v^l, SET({z_u^l | u ∈ N(v)}))
     ✴ Multiple occurrences of the same value don’t count
     ✴ GraphSAGE-pool
       ‣ z_v^{l+1} = max({σ(W^l z_u^l + b^l) | u ∈ N(v)})
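The GraphSAGE-pool update above can be sketched in a few lines of NumPy; this assumes a ReLU nonlinearity and a plain adjacency-list representation, and the function name is illustrative:

```python
import numpy as np

def sage_pool_layer(z, neighbors, W, b):
    """One GraphSAGE-pool (set-aggregation) update.

    z: (n, d) node features; neighbors: list of neighbor index lists;
    W: (d, d) weight matrix; b: (d,) bias.
    """
    relu = lambda x: np.maximum(x, 0.0)       # sigma, assumed to be ReLU
    out = np.zeros_like(z)
    for v, nbrs in enumerate(neighbors):
        msgs = relu(z[nbrs] @ W + b)          # transform each neighbor
        out[v] = msgs.max(axis=0)             # elementwise max-pool
    return out
```

Because the elementwise max ignores how many times a value occurs, two neighbors with identical features contribute exactly as much as one, which matches the "multiple values don’t count" property of set aggregation.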
  8. MB-GNN
     ✴ z_v^{l+1} = f^l(z_v^l, MULTISET({z_u^l | u ∈ N(v)}))
     ✴ Permutation invariant
     ✴ GraphSAGE-mean
       ‣ z_v^{l+1} = CONCAT(z_v^l, (1/|N(v)|) Σ_{u∈N(v)} W^l z_u^l)
     ✴ GAT, GIN, etc.
  9. Proposed: VVc-GNN
     ✴ z_v^{l+1} = f^l(z_v^l, z_{p_tail(v,1)}^l, p_n(v,1), z_{p_tail(v,2)}^l, p_n(v,2), ⋯, z_{p_tail(v,Δ)}^l, p_n(v,Δ))
     ✴ Handles messages from different nodes differently
     ✴ CPNGNN (proposed!)
       ‣ z_v^{l+1} = σ(W^l CONCAT(z_v^l, z_{p_tail(v,1)}^l, p_n(v,1), z_{p_tail(v,2)}^l, p_n(v,2), ⋯, z_{p_tail(v,Δ)}^l, p_n(v,Δ)))
       ‣ Most powerful among VVc-GNNs (if node features are finite)
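A CPNGNN-style layer can be sketched as follows, using a consistent port numbering p as built earlier. The zero-padding for ports beyond deg(v) and the function name are my own assumptions for illustration, not details fixed by the paper:

```python
import numpy as np

def cpngnn_layer(z, p, deg_max, W):
    """One CPNGNN-style update: concatenate a node's own state with one
    (neighbor state, sender port number) pair per port 1..deg_max.

    z: (n, d) node features; p: dict (v, i) -> (u, j) consistent port
    numbering; W: (d_out, d + deg_max * (d + 1)) weight matrix.
    """
    n, d = z.shape
    outs = []
    for v in range(n):
        parts = [z[v]]
        for i in range(1, deg_max + 1):
            if (v, i) in p:
                u, j = p[(v, i)]
                parts.append(z[u])            # message arriving at port i
                parts.append(np.array([j]))   # the sender's port number
            else:                             # pad unused ports (assumed)
                parts.append(np.zeros(d))
                parts.append(np.array([0.0]))
        outs.append(np.maximum(W @ np.concatenate(parts), 0.0))  # sigma = ReLU
    return np.stack(outs)
```

Unlike mean or max aggregation, the concatenation keeps each port's message in a fixed slot together with its port number, so two neighbors sending identical features through different ports still produce different inputs to W.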
 10. Hierarchy of GNNs
     ✴ Representation power correspondence
       ‣ SB-GNN = SB(1)
       ‣ MB-GNN = MB(1)
       ‣ VVc-GNN = VVc(1)
     ✴ SB-GNN ≤ MB-GNN ≤ VVc-GNN (natural by definition)
       ‣ Is the order strict?
 11. Is VVc-GNN Strictly Stronger than MB-GNN?
     ✴ Finding-single-leaf problem
       ‣ Select exactly one leaf from a star graph
       ‣ MB-GNN cannot solve it
         • Leaf embeddings are always identical
       ‣ VVc-GNN can solve it
     ✴ SB-GNN < MB-GNN < VVc-GNN
     (Figure: star graph, https://en.wikipedia.org/wiki/Star_(graph_theory))
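The claim that the leaf embeddings stay identical can be checked numerically. The toy simulation below (my own, not the paper's construction) runs a few rounds of GraphSAGE-mean-style multiset aggregation on a star graph with identical initial features; by symmetry every leaf receives the same multiset of messages in every round, so no permutation-invariant readout can single out one leaf:

```python
import numpy as np

# Star K_{1,3}: node 0 is the center, nodes 1-3 are leaves.
neighbors = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
z = np.ones((4, 2))                      # identical initial node features
for _ in range(3):                       # a few mean-aggregation rounds
    z = np.stack([
        np.concatenate([z[v], np.mean(z[neighbors[v]], axis=0)])
        for v in range(4)
    ])
# All leaf embeddings coincide:
print(np.allclose(z[1], z[2]) and np.allclose(z[2], z[3]))  # True
```

A VVc-GNN escapes this symmetry because the center sends each leaf a message tagged with a different port number, so the leaves' inputs differ from the first round on.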
 12. Approximation Ratios of VVc-GNN
     With degree information given as the node input feature:
     ✴ Minimum Dominating Set: Δ + 1
     ✴ Minimum Vertex Cover: 2
     ✴ Maximum Matching: no α-approximation for any α > 0
 13. Give More Information as Node Features
     ✴ Degree information alone might be too poor
     ✴ Weak 2-coloring
       ‣ Every graph has a weak 2-coloring
       ‣ Can be computed in linear time by breadth-first search
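Coloring nodes by the parity of their BFS depth gives such a weak 2-coloring in linear time: every non-isolated node ends up with at least one neighbor of the other color (its BFS parent, or for a root, one of its children). A sketch, with an illustrative function name:

```python
from collections import deque

def weak_two_coloring(neighbors):
    """Weak 2-coloring via BFS depth parity.

    neighbors: dict node -> list of adjacent nodes.
    Returns color: dict node -> 0 or 1, where every node with at least
    one neighbor has a neighbor of the opposite color.
    """
    color = {}
    for s in neighbors:                  # handle disconnected graphs
        if s in color:
            continue
        color[s] = 0                     # new BFS root
        q = deque([s])
        while q:
            v = q.popleft()
            for u in neighbors[v]:
                if u not in color:
                    color[u] = 1 - color[v]   # opposite of BFS parent
                    q.append(u)
    return color
```

Note this is weaker than a proper 2-coloring: two adjacent nodes may share a color (e.g. in an odd cycle), but each node is still guaranteed one differently-colored neighbor, which is all the constructions on the next slide need.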
 14. Approximation Ratios of VVc-GNN (continued)
     With degree and weak 2-coloring information given as node input features:
     ✴ Minimum Dominating Set: Δ + 1 → (Δ + 1)/2
     ✴ Minimum Vertex Cover: 2
     ✴ Maximum Matching: no α-approximation for any α > 0 → (Δ + 1)/2
 15. Summary
     ✴ Establishes a stronger GNN model using knowledge from distributed local algorithms
     ✴ Analyzes GNNs’ approximation ratios for combinatorial problems
     ✴ Proves that preprocessing strengthens representation power
 16. My Thoughts
     ✴ Impressed with how well distributed local algorithms model GNNs
       ✓ The number of layers is constant
       - The degree upper-bound assumption
     ✴ The proposed network seems too artificial (concatenating integers to feature vectors)
       - Does it actually learn well?
 17. References
     1. R. Sato, M. Yamada, and H. Kashima. Approximation Ratios of Graph Neural Networks for Combinatorial Problems. In NeurIPS 2019.
     2. K. Xu, W. Hu, J. Leskovec, and S. Jegelka. How Powerful are Graph Neural Networks? In ICLR 2019.
     3. L. Hella et al. Weak Models of Distributed Computing, with Connections to Modal Logic. In PODC 2012.