
Neural Networks for Graph Data @NeurIPS2018

This deck introduces three papers on graph neural networks from NeurIPS 2018.

Ryosuke Kamesawa

January 26, 2019

  1. About me - Ryosuke Kamesawa (亀澤諒亮) - Twitter: @cruelturtle - Until 2018.03:
     M.S., Graduate School of Information Science and Technology, The University of Tokyo - Research: Gaussian processes, PAC-Bayes - From 2018.04: DeNA - R&D engineer - AI drug discovery
  2. Outline - About graphs - Graph neural networks @ NeurIPS - Introduction of the spotlight papers:
     - Hierarchical Graph Representation Learning with Differentiable Pooling [Ying+, NeurIPS'18]
     - Link Prediction Based on Graph Neural Networks [Zhang+, NeurIPS'18]
     - Graph Convolutional Policy Network for Goal-Directed Molecular Graph Generation [You+, NeurIPS'18]
  3. Graphs - A data structure made of nodes (vertices) and edges (links) - In this talk:
     - basically undirected graphs (edge direction is ignored) - each node v has a feature vector x_v - concretely, a graph is represented by a feature matrix and an adjacency matrix - Feature matrix - Adjacency matrix
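To make the representation concrete, here is a tiny NumPy illustration (not from the slides; the three-node graph and its features are made up):

```python
import numpy as np

# Toy undirected graph: 0 -- 1 -- 2 (nodes 0 and 2 are not directly connected).

# Feature matrix X: one row per node, row v is the feature vector x_v.
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])          # shape (num_nodes, num_features)

# Adjacency matrix A: A[v, w] = 1 if there is an edge between v and w.
# It is symmetric because the graph is undirected.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
```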
  4. Main machine learning tasks on graphs - Node classification - predict the label attached to a node - e.g. classifying users in a social network -
     Graph classification - predict the label attached to a graph - e.g. classifying whether a compound graph is active - Link prediction - predict whether an edge exists between two nodes - e.g. product recommendation on a bipartite graph whose nodes are users and items
  5. Number of GNN-related papers (main conference)

     Task                            NeurIPS 2018   NIPS 2017
     Node classification             2              2
     Graph classification           1              0
     Link prediction                2              1
     Graph generation               3              0
     Logical / combinatorial task   3              2
     Representation in visual task  6              1
     Total                          17             6
  6. Papers introduced (Spotlight)
     1. Hierarchical Graph Representation Learning with Differentiable Pooling [Ying+, NeurIPS'18]
     2. Link Prediction Based on Graph Neural Networks [Zhang+, NeurIPS'18]
     3. Graph Convolutional Policy Network for Goal-Directed Molecular Graph Generation [You+, NeurIPS'18]
  7. Hierarchical Graph Representation Learning with Differentiable Pooling [Ying+, NeurIPS'18]
     - Points out the lack of "hierarchy" in graph classification - Hypothesis: hierarchical pooling is effective for feature extraction in GNNs as well - Proposes a pooling operation for GNNs that can be trained end-to-end - Higher accuracy than existing methods on many benchmarks
     https://papers.nips.cc/paper/7729-hierarchical-graph-representation-learning-with-differentiable-pooling.pdf
  8. Graph Convolution - Graph convolutional networks [Kipf & Welling, ICLR'17]
     https://tkipf.github.io/graph-convolutional-networks/
  9. Graph Convolution - Graph convolutional networks [Kipf & Welling, ICLR'17]
     https://tkipf.github.io/graph-convolutional-networks/ (normalized) adjacency matrix
  10. Graph Convolution - Graph convolutional networks [Kipf & Welling, ICLR'17]
     https://tkipf.github.io/graph-convolutional-networks/ linear transformation of each node's hidden-layer representation
  11. Graph Convolution - Graph convolutional networks [Kipf & Welling, ICLR'17]
     https://tkipf.github.io/graph-convolutional-networks/ sum of the linear transformations of the hidden-layer representations of neighboring nodes
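Slides 8-11 annotate, piece by piece, the GCN propagation rule H^(l+1) = σ(Â H^(l) W^(l)) from Kipf & Welling. A minimal NumPy sketch of one such layer (the toy graph and randomly initialized weights are made up for illustration):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: H_next = ReLU(A_norm @ H @ W).

    A: (n, n) adjacency matrix, H: (n, d_in) node features / hidden states,
    W: (d_in, d_out) trainable weight matrix.
    """
    A_hat = A + np.eye(A.shape[0])               # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt     # (normalized) adjacency matrix
    return np.maximum(A_norm @ H @ W, 0.0)       # sum of neighbors' linear transforms, then ReLU

# Toy usage: the 3-node path graph 0 -- 1 -- 2
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
W = np.random.randn(2, 4)                        # randomly initialized weights
H_next = gcn_layer(A, H, W)                      # shape (3, 4)
```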
  12. CNN vs. GNN - Grid graph ↔ General graph - Convolution ↔ Graph convolution - Pooling ↔ ???
     https://www.jeremyjordan.me/convnet-architectures/ Standard CNN
  13. Differentiable Pooling (DIFFPOOL) [Ying+, NeurIPS'18] - Defines pooling on a graph as soft clustering - A separate network predicts the probability that each node belongs to each cluster in the next layer -
     An explicit regularizer encourages adjacent nodes to belong to the same cluster
     https://papers.nips.cc/paper/7729-hierarchical-graph-representation-learning-with-differentiable-pooling.pdf
  14. Differentiable Pooling (DIFFPOOL) [Ying+, NeurIPS'18] DIFFPOOL: - adjacency matrix - feature matrix
     - the pooled feature of each cluster is the sum of the feature vectors of the nodes assigned to it
  15. Differentiable Pooling (DIFFPOOL) [Ying+, NeurIPS'18] DIFFPOOL: - adjacency matrix - feature matrix
     - the pooled adjacency matrix preserves the structure of A(l): if v is assigned to cluster V and w to cluster W, an edge vw implies an edge VW
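The equations these two annotations refer to appear only as figures in the deck; reconstructed from the DIFFPOOL paper, the coarsening step is:

```latex
% DIFFPOOL coarsening ([Ying+, NeurIPS'18]).
% S^{(l)} is the soft cluster-assignment matrix (n_l x n_{l+1}),
% Z^{(l)} the node embeddings at layer l (n_l x d).
X^{(l+1)} = {S^{(l)}}^{\top} Z^{(l)}, \qquad
A^{(l+1)} = {S^{(l)}}^{\top} A^{(l)} S^{(l)}
```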
  16. Differentiable Pooling (DIFFPOOL) [Ying+, NeurIPS'18] Regularization - Link-prediction regularization
     - so that adjacent nodes are assigned to the same cluster - Entropy regularization - so that each node's probability of belonging to one particular cluster is large
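Written out (the slide shows them as images; these are the forms given in the paper), the two regularizers are:

```latex
% Link-prediction loss: adjacent nodes should get similar cluster assignments.
L_{\mathrm{LP}} = \left\lVert A^{(l)} - S^{(l)} {S^{(l)}}^{\top} \right\rVert_F
% Entropy loss: each row S_i of the assignment matrix should be close to one-hot.
L_{\mathrm{E}} = \frac{1}{n} \sum_{i=1}^{n} H\!\left(S_i\right)
```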
  17. DIFFPOOL summary - Proposes a pooling method for graph data - Trainable end-to-end - Can be stacked hierarchically -
     However, it needs an additional network to compute the soft clustering - SotA on many benchmarks
  18. Link Prediction Based on Graph Neural Networks [Zhang+, NeurIPS'18] -
     Theory: existing heuristics can be approximated from enclosing subgraphs - Proposes GNN-based link prediction - Higher AUC than existing methods on many benchmarks
     http://papers.nips.cc/paper/7763-link-prediction-based-on-graph-neural-networks.pdf
  19. 𝛾-decaying heuristics - Heuristics that can be computed in the form below - Theorem: under the conditions below, the error of approximating a 𝛾-decaying heuristic from the h-hop enclosing subgraph centered at x and y decreases exponentially in h
     - If the heuristic itself is learned, approximating it from the enclosing subgraph alone is good enough
     (x and y are the two endpoint nodes of the candidate edge)
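The defining formula and the theorem's conditions are shown as images on the slide; as stated in the SEAL paper, a 𝛾-decaying heuristic has the form below, and the theorem assumes (roughly) that f(x, y, l) ≤ λ^l with 𝛾λ < 1 and that f(x, y, l) is computable from the h-hop enclosing subgraph for small l:

```latex
% \gamma-decaying heuristic ([Zhang+, NeurIPS'18]):
% \eta > 0 and 0 < \gamma < 1 are constants, and f(x, y, l) is a nonnegative
% function of the node pair (x, y) and the walk length l on the given graph.
\mathcal{H}(x, y) = \eta \sum_{l=1}^{\infty} \gamma^{l} \, f(x, y, l)
```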
  21. Algorithm: SEAL - SEAL (learning from Subgraphs, Embeddings and Attributes for Link prediction)
     - Graph classification with the enclosing subgraph as input - Uses DGCNN [Zhang+, AAAI'18] as the model - Can handle node feature vectors and embedding vectors - Issue: the two endpoint nodes of the candidate edge must be distinguished from the other nodes
     http://papers.nips.cc/paper/7763-link-prediction-based-on-graph-neural-networks.pdf
  22. Algorithm: SEAL - Node labeling (DRNL) - Each node is labeled according to its distances from the two endpoints x and y
     - The one-hot encoding of the DRNL label is appended to the node's embedding vector
     http://papers.nips.cc/paper/7763-link-prediction-based-on-graph-neural-networks.pdf
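The labeling rule itself appears only as a figure; below is a minimal sketch of the double-radius node labeling (DRNL) hash as described in the SEAL paper, where dist_x and dist_y are the shortest-path distances from a node to the endpoints x and y inside the enclosing subgraph:

```python
def drnl_label(dist_x, dist_y):
    """Double-radius node labeling (DRNL) hash from the SEAL paper.

    dist_x, dist_y: shortest-path distances to the endpoints x and y.
    The endpoints themselves get label 1; nodes unreachable from x or y get 0.
    """
    if dist_x == 0 or dist_y == 0:                 # the endpoints x and y
        return 1
    if dist_x == float("inf") or dist_y == float("inf"):
        return 0                                   # unreachable node
    d = dist_x + dist_y
    return 1 + min(dist_x, dist_y) + (d // 2) * (d // 2 + d % 2 - 1)

# Example: a common neighbor of x and y (distance 1 to both) gets label 2.
assert drnl_label(1, 1) == 2
```

In SEAL the resulting integer label is one-hot encoded and concatenated to the node's feature/embedding vector, as the slide says.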
  23. Graph Convolutional Policy Network for Goal-Directed Molecular Graph Generation [You+, NeurIPS'18]
     - Proposes graph generation using reinforcement learning - A domain-specific objective can be optimized by using it as the reward - Higher validity and better objective optimization than existing methods
     http://papers.nips.cc/paper/7877-graph-convolutional-policy-network-for-goal-directed-molecular-graph-generation.pdf
  24. Molecular graph generation - In drug discovery and materials science, designing molecules is a central problem - Currently designed by human experts →
     we would like to automate it - Why molecular design is hard: - the search space is enormous - graph constraints such as valence (node degree) - optimizing the objective function - With the progress of DNNs, GAN- and VAE-based methods have been proposed
  25. Prior work on molecular generation

                 Sequence                                                   Graph
     RNN         ChemTS [Yang+, J. Cheminfo'17], [Ertl+, '17],              [Li+, J. Cheminfo'18]
                 [Bjerrum+, '17], [Segler+, ACS Cent. Sci.'18]
     RNN / RL    [Olivecrona+, J. Cheminfo'17]                              GraphRNN [You+, ICML'18]
     VAE         SD-VAE [Dai+, '18], [Gómez-Bombarelli+, '17],              JT-VAE [Jin+, ICML'18], GraphVAE [Simonovsky+, '18],
                 Grammar-VAE [Kusner+, ICML'17],                            [Ma+, NIPS'18], [Liu+, NIPS'18]
                 Character-VAE [Gómez-Bombarelli+, ACS Cent. Sci.'16]
     GAN / RL    ORGAN [Guimaraes+, '17], ORGANIC [Sanchez-Lengeling+, '17]  GCPN [You+, NIPS'18]
  26. Graph generation environment - Start from a single node and grow the graph by adding nodes one step at a time - State: the partially generated graph - Action: adding an edge or a node - select a node in the partially generated graph
     - select the other node - select the edge label - State transition: move to the newly generated graph only if it is valid (a toy sketch of this environment follows the next slide)
     http://papers.nips.cc/paper/7877-graph-convolutional-policy-network-for-goal-directed-molecular-graph-generation.pdf
  27. Reward design - Intermediate: step reward + adversarial reward -
     Final: domain-specific reward + adversarial reward - Step reward: +𝜀 for a valid graph, −𝜀 for an invalid graph - Adversarial reward: the discriminator is a GNN that takes a graph as input and is trained jointly with the policy
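A toy sketch (not the paper's implementation) of the environment step and the reward structure from the last two slides; `is_valid`, `domain_score`, and `discriminator` are hypothetical callables, and the graph object is assumed to have a networkx-like API:

```python
EPS = 0.1  # magnitude of the step reward (illustrative value)

def env_step(graph, action, is_valid, discriminator, domain_score):
    """One transition of a toy GCPN-style generation environment.

    action = (first_node, second_node, edge_label, stop): connect first_node
    (a node of the current graph) to second_node with the given edge label.
    """
    first, second, edge_label, stop = action
    candidate = graph.copy()
    candidate.add_edge(first, second, label=edge_label)   # may introduce a new node

    if is_valid(candidate):          # e.g. valence check for molecular graphs
        step_reward = EPS
        next_graph = candidate       # transition only if the new graph is valid
    else:
        step_reward = -EPS
        next_graph = graph           # otherwise stay in the current state

    # Adversarial reward: how "realistic" the jointly trained discriminator
    # (a GNN over the graph) judges the current graph to be.
    reward = step_reward + discriminator(next_graph)
    if stop:                         # final step: add the domain-specific reward
        reward += domain_score(next_graph)   # e.g. predicted drug-likeness
    return next_graph, reward, stop
```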
  28. Policy network - The action is determined from the node embeddings X produced by a GCN - Trained with proximal policy optimization (PPO) [Schulman+, '17]
     - Pretrained by sampling trajectories from known graphs
     http://papers.nips.cc/paper/7877-graph-convolutional-policy-network-for-goal-directed-molecular-graph-generation.pdf
  29. GCPN summary - Formulates molecular generation as a reinforcement-learning problem - The partially generated graph is the state and actions grow the graph - The adversarial reward makes the
     generated molecules look "molecule-like" - Different tasks can be solved by designing the reward accordingly - Generated molecules with higher validity and better objective values than existing methods
  30. Libraries for Graph Neural Networks - pytorch_geometric - https://github.com/rusty1s/pytorch_geometric -
     DeepGraphLibrary - https://www.dgl.ai - Chainer Chemistry - https://github.com/pfnet-research/chainer-chemistry
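As a quick taste of the first of these libraries, a minimal two-layer GCN for node classification in pytorch_geometric (assuming torch and torch_geometric are installed; the layer sizes and toy graph are arbitrary):

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class GCN(torch.nn.Module):
    def __init__(self, num_features, hidden_dim, num_classes):
        super().__init__()
        self.conv1 = GCNConv(num_features, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, x, edge_index):
        # x: (num_nodes, num_features); edge_index: (2, num_edges) COO edge list
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)

# Toy usage: the 3-node path graph 0 -- 1 -- 2
x = torch.randn(3, 8)
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]])    # both directions of each undirected edge
model = GCN(num_features=8, hidden_dim=16, num_classes=2)
out = model(x, edge_index)                   # (3, 2) per-node class scores
```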
  31. Summary - The number of papers dealing with GNNs grew substantially at this year's NeurIPS - especially molecular generation & computer
     vision - Introduced three spotlight papers - Hierarchical differentiable pooling - Link prediction based on GNN - Graph convolutional policy network - The basic tools for applications seem to be falling into place - The focus is shifting to what to apply them to and how
  32. GNN-related papers at NeurIPS 2018 (1)
     - Node classification
       - Adaptive Sampling Towards Fast Graph Representation Learning [Huang+]
       - Mean-field theory of graph neural networks in graph partitioning [Kawamoto+]
     - Graph classification
       - Hierarchical Graph Representation Learning with Differentiable Pooling [Ying+]
     - Link prediction
       - Link Prediction Based on Graph Neural Networks [Zhang+]
       - SimplE Embedding for Link Prediction in Knowledge Graphs [Kazemi+]
     - Graph generation
       - Constrained Generation of Semantically Valid Graphs via Regularizing Variational Autoencoders [Ma+]
       - Constrained Graph Variational Autoencoders for Molecule Design [Liu+]
       - Graph Convolutional Policy Network for Goal-Directed Molecular Graph Generation [You+]
  33. GNN-related papers at NeurIPS 2018 (2)
     - Logical / combinatorial task
       - Combinatorial Optimization with Graph Convolutional Networks and Guided Tree Search [Li+]
       - Embedding Logical Queries on Knowledge Graphs [Hamilton+]
       - Recurrent Relational Networks [Palm+]
     - Representation in visual task
       - Beyond Grids: Learning Graph Representations for Visual Recognition [Li+]
       - Learning Conditioned Graph Structures for Interpretable Visual Question Answering [Norcliffe-Brown+]
       - LinkNet: Relational Embedding for Scene Graph [Woo+]
       - Mapping Images to Scene Graphs with Permutation-Invariant Structured Prediction [Herzig+]
       - Out of the Box: Reasoning with Graph Convolution Nets for Factual Visual Question Answering [Narasimhan+]
       - Symbolic Graph Reasoning Meets Convolutions [Liang+]
  34. (Reference) GNN-related papers at NIPS 2017
     - Node classification
       - Inductive Representation Learning on Large Graphs [Hamilton+]
       - Learning Graph Representations with Embedding Propagation [Duran+]
     - Representation in vision
       - Pixels to Graphs by Associative Embedding [Newell+]
     - Logical / combinatorial task
       - Premise Selection for Theorem Proving by Deep Graph Embedding [Wang+]
       - Learning Combinatorial Optimization Algorithms over Graphs [Khalil+]
     - Link prediction
       - Geometric Matrix Completion with Recurrent Multi-Graph Neural Networks [Monti+]
     - Other
       - Protein Interface Prediction using Graph Convolutional Networks [Fout+]
  35. References
     W. Huang, T. Zhang, Y. Rong, and J. Huang, "Adaptive Sampling Towards Fast Graph Representation Learning," NIPS 2018.
     Y. Li and A. Gupta, "Beyond Grids: Learning Graph Representations for Visual Recognition," NIPS 2018.
     Z. Li, Q. Chen, and V. Koltun, "Combinatorial Optimization with Graph Convolutional Networks and Guided Tree Search," NIPS 2018.
     T. Ma, J. Chen, and C. Xiao, "Constrained Generation of Semantically Valid Graphs via Regularizing Variational Autoencoders," NIPS 2018.
     Q. Liu, M. Allamanis, M. Brockschmidt, and A. Gaunt, "Constrained Graph Variational Autoencoders for Molecule Design," NIPS 2018.
     W. Hamilton, P. Bajaj, M. Zitnik, D. Jurafsky, and J. Leskovec, "Embedding Logical Queries on Knowledge Graphs," NIPS 2018.
     J. You, B. Liu, Z. Ying, V. Pande, and J. Leskovec, "Graph Convolutional Policy Network for Goal-Directed Molecular Graph Generation," NIPS 2018.
     M. Simonovsky and N. Komodakis, "GraphVAE: Towards Generation of Small Graphs Using Variational Autoencoders," arXiv:1802.03480 [cs], Feb. 2018.
     R. Ying, J. You, C. Morris, X. Ren, W. L. Hamilton, and J. Leskovec, "Hierarchical Graph Representation Learning with Differentiable Pooling," NIPS 2018.
     P. Ertl, R. Lewis, E. Martin, and V. Polyakov, "In silico generation of novel, drug-like chemical matter using the LSTM neural network," arXiv:1712.07449 [cs, q-bio], Dec. 2017.
     W. Norcliffe-Brown, S. Vafeias, and S. Parisot, "Learning Conditioned Graph Structures for Interpretable Visual Question Answering," NIPS 2018.
     A. Garcia Duran and M. Niepert, "Learning Graph Representations with Embedding Propagation," NIPS 2017.
     M. Zhang and Y. Chen, "Link Prediction Based on Graph Neural Networks," NIPS 2018.
     S. Woo, D. Kim, D. Cho, and I. S. Kweon, "LinkNet: Relational Embedding for Scene Graph," NIPS 2018.
     R. Herzig, M. Raboh, G. Chechik, J. Berant, and A. Globerson, "Mapping Images to Scene Graphs with Permutation-Invariant Structured Prediction," NIPS 2018.
     M. Olivecrona, T. Blaschke, O. Engkvist, and H. Chen, "Molecular De Novo Design through Deep Reinforcement Learning," arXiv:1704.07555 [cs], Apr. 2017.
     E. J. Bjerrum and R. Threlfall, "Molecular Generation with Recurrent Neural Networks (RNNs)," arXiv:1705.04612 [cs, q-bio], May 2017.
     S. Kearnes, K. McCloskey, M. Berndl, V. Pande, and P. Riley, "Molecular Graph Convolutions: Moving Beyond Fingerprints," Journal of Computer-Aided Molecular Design, vol. 30, no. 8, pp. 595-608, Aug. 2016.
     Y. Li, L. Zhang, and Z. Liu, "Multi-Objective De Novo Drug Design with Conditional Graph Generative Model," arXiv:1801.07299 [cs, q-bio], Jan. 2018.
     A. Grover and J. Leskovec, "node2vec: Scalable Feature Learning for Networks," KDD 2016.
     M. Narasimhan, S. Lazebnik, and A. Schwing, "Out of the Box: Reasoning with Graph Convolution Nets for Factual Visual Question Answering," NIPS 2018.
     A. Newell and J. Deng, "Pixels to Graphs by Associative Embedding," NIPS 2017.
     M. Wang, Y. Tang, J. Wang, and J. Deng, "Premise Selection for Theorem Proving by Deep Graph Embedding," NIPS 2017.
     A. Fout, J. Byrd, B. Shariat, and A. Ben-Hur, "Protein Interface Prediction using Graph Convolutional Networks," NIPS 2017.
     T. N. Kipf and M. Welling, "Semi-Supervised Classification with Graph Convolutional Networks," ICLR 2017.
     S. M. Kazemi and D. Poole, "SimplE Embedding for Link Prediction in Knowledge Graphs," NIPS 2018.
     X. Liang, Z. Hu, H. Zhang, L. Lin, and E. P. Xing, "Symbolic Graph Reasoning Meets Convolutions," NIPS 2018.
     S. Abu-El-Haija, B. Perozzi, R. Al-Rfou, and A. A. Alemi, "Watch Your Step: Learning Node Embeddings via Graph Attention," NIPS 2018.
     M. Zhang, Z. Cui, M. Neumann, and Y. Chen, "An End-to-End Deep Learning Architecture for Graph Classification," AAAI 2018.
     F. Monti, M. Bronstein, and X. Bresson, "Geometric Matrix Completion with Recurrent Multi-Graph Neural Networks," NIPS 2017.
     E. Khalil, H. Dai, Y. Zhang, B. Dilkina, and L. Song, "Learning Combinatorial Optimization Algorithms over Graphs," NIPS 2017.
     W. Hamilton, Z. Ying, and J. Leskovec, "Inductive Representation Learning on Large Graphs," NIPS 2017.