Collaborative Topic Modeling for Recommending Scientific Articles

論文"Collaborative Topic Modeling for Recommending Scientific Articles"を読んだ際に使用したスライド

Shinichi Takayanagi

May 30, 2016

Transcript

  1. RCO Paper Reading Group (2016/05/27): "Collaborative Topic Modeling for Recommending Scientific Articles" (KDD 2011) by Chong Wang and David M. Blei. Presenter: Shinichi Takayanagi
  2. ABSTRACT

  3. 1. INTRODUCTION

  4. 1. INTRODUCTION

  5. 1. INTRODUCTION

  6. 2. BACKGROUND & 2.1 Recommendation Tasks

  7. 2.1 Recommendation Tasks

  8. 2.1 Recommendation Tasks

  9. 2.2 Recommendation by Matrix Factorization

  10. 2.2 Recommendation by Matrix Factorization

  11. 2.2 Recommendation by Matrix Factorization

  12. 2.2 Recommendation by Matrix Factorization
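    For reference, a sketch of the confidence-weighted matrix factorization objective that this part of the paper builds on (notation as in the paper):

        \min_{U,V} \sum_{i,j} \frac{c_{ij}}{2} \left( r_{ij} - u_i^\top v_j \right)^2
            + \frac{\lambda_u}{2} \sum_i \| u_i \|^2
            + \frac{\lambda_v}{2} \sum_j \| v_j \|^2

    where c_{ij} is a confidence weight that is larger for observed ratings (r_{ij} = 1) than for unobserved ones (r_{ij} = 0).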

  13. 2.3 Probabilistic Topic Models

  14. The LDA generative process
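    A sketch of the LDA generative process for a document w_j, in the paper's notation:

        \theta_j \sim \mathrm{Dirichlet}(\alpha)
        z_{jn} \mid \theta_j \sim \mathrm{Mult}(\theta_j)
        w_{jn} \mid z_{jn}, \beta \sim \mathrm{Mult}(\beta_{z_{jn}})

    Each document draws topic proportions, and each word draws a topic assignment and then a word from that topic's distribution over the vocabulary.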

  15. Characteristics of LDA

  16. 3. COLLABORATIVE TOPIC REGRESSION

  17. COLLABORATIVE TOPIC REGRESSION

  18. The CTR generative process
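    A sketch of the CTR generative process described in the paper:

        u_i \sim \mathcal{N}(0, \lambda_u^{-1} I_K)                      (user latent vector)
        \theta_j \sim \mathrm{Dirichlet}(\alpha)                         (topic proportions of item j)
        \epsilon_j \sim \mathcal{N}(0, \lambda_v^{-1} I_K), \quad v_j = \theta_j + \epsilon_j    (item latent vector = topics + offset)
        z_{jn} \sim \mathrm{Mult}(\theta_j), \quad w_{jn} \sim \mathrm{Mult}(\beta_{z_{jn}})     (words of item j)
        r_{ij} \sim \mathcal{N}(u_i^\top v_j, c_{ij}^{-1})               (rating of user i on item j)

    The item offset \epsilon_j is the key ingredient: v_j stays close to the LDA topic proportions \theta_j but can deviate when the collaborative signal pulls it away.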

  19. 3. COLLABORATIVE TOPIC REGRESSION

  20. Why the CTR model is called a "regression"

  21. How the model is trained

  22. How the model is trained

  23. A simple proof (handwritten on an iPad)

  24. How the model is trained
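    These slides cover the coordinate-ascent learning of the MAP objective; a sketch of the closed-form updates for the latent vectors (with the topic proportions \theta_j held fixed) is

        u_i \leftarrow ( V C_i V^\top + \lambda_u I_K )^{-1} V C_i R_i
        v_j \leftarrow ( U C_j U^\top + \lambda_v I_K )^{-1} ( U C_j R_j + \lambda_v \theta_j )

    where C_i is the diagonal matrix of confidences c_{ij} and R_i the vector of ratings for user i; \theta_j is then updated via the LDA terms (or simply fixed to its LDA estimate in the cheaper variant mentioned in the paper).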

  25. Prediction
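    Prediction in CTR, roughly as the paper summarizes it: for an in-matrix article (one that already has ratings) the score uses the learned item vector,

        \hat{r}_{ij} \approx u_i^\top v_j = u_i^\top ( \theta_j + \epsilon_j )

    while for an out-of-matrix (brand-new) article only the content is available, so

        \hat{r}_{ij} \approx u_i^\top \theta_j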

  26. 4. EMPIRICAL STUDY

  27. Scale of the dataset

  28. Evaluation

  29. Results

  30. Results: dependence on the number of articles in a user's library (Fig 5) and on the number of users who liked an article (Fig 6)

    As the number of articles in a user's library grows, recall drops (the model has to surface less well-known articles); as the number of users who liked an article grows, recall rises (CF works well for articles that many users are reading).
  31. Results: preferred topics extracted for two example users, by ranking the weights of the latent topic vector
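    A minimal NumPy sketch of this ranking step (the user vector and topic labels below are stand-ins, not the paper's data):

        import numpy as np

        # Minimal sketch (not the paper's code): rank the topics a user prefers
        # by the weights of the user's latent vector, as the slide describes.
        K = 5
        rng = np.random.default_rng(0)
        u_i = rng.random(K)                              # stand-in for a learned user vector
        topic_labels = [f"topic_{k}" for k in range(K)]  # stand-in for per-topic top-word summaries

        top = np.argsort(-u_i)[:3]                       # indices of the 3 largest weights
        for k in top:
            print(f"{topic_labels[k]}: weight = {u_i[k]:.3f}")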

  32. Results: the 10 articles with the largest offsets ※ corresponds to cases where CF matters more than the content

  33. Results: an example where the EM paper is also frequently referenced by the Bayesian-statistics community ※ again a case where CF matters more than the content

  34. Results: conversely, an example where the topic does not spread ※ a case where the content is dominant

  35. 5. CONCLUSIONS AND FUTURE WORK