Slide 1

RCO Paper Reading Group (2016/05/27): "Collaborative topic modeling for recommending scientific articles" (KDD 2011), Chong Wang and David M. Blei. Presenter: 高柳慎一

Slide 2

ABSTRACT

Slide 3

1. INTRODUCTION

Slide 4

1. INTRODUCTION

Slide 5

1. INTRODUCTION

Slide 6

2. BACKGROUND & 2.1 Recommendation Tasks

Slide 7

2.1 Recommendation Tasks

Slide 8

2.1 Recommendation Tasks

Slide 9

2.2 Recommendation by Matrix Factorization

Slide 10

2.2 Recommendation by Matrix Factorization

Slide 11

2.2 Recommendation by Matrix Factorization

Slide 12

2.2 Recommendation by Matrix Factorization
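The slide bodies in this transcript were not extracted, so what follows is only a rough sketch of the matrix-factorization baseline that Section 2.2 of the paper builds on: ratings r_ij ~ N(u_i^T v_j, 1/c_ij) with Gaussian priors u_i ~ N(0, I/lambda_u) and v_j ~ N(0, I/lambda_v). All function and variable names are illustrative, not from the slides.

import numpy as np

def update_user_vectors(R, C, V, lam_u):
    # One alternating-least-squares pass over the user vectors with the item
    # vectors V held fixed: u_i = (V^T C_i V + lambda_u I)^{-1} V^T C_i R_i.
    # R: (I, J) binary ratings, C: (I, J) confidence weights, V: (J, K) item vectors.
    I, J = R.shape
    K = V.shape[1]
    U = np.zeros((I, K))
    for i in range(I):
        A = (V * C[i][:, None]).T @ V + lam_u * np.eye(K)   # V^T diag(C_i) V + lambda_u I
        b = V.T @ (C[i] * R[i])                             # V^T diag(C_i) R_i
        U[i] = np.linalg.solve(A, b)
    return U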

Slide 13

2.3 Probabilistic Topic Models

Slide 14

Generative process of LDA
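As a quick refresher (a sketch in my own words, not taken from the slide), LDA's generative story is: draw topic proportions theta ~ Dirichlet(alpha) for each document, then for each word draw a topic z ~ Mult(theta) and a word w ~ Mult(beta_z).

import numpy as np

def generate_document(alpha, beta, n_words, rng=None):
    # alpha: (K,) Dirichlet parameter, beta: (K, V) per-topic word distributions.
    if rng is None:
        rng = np.random.default_rng()
    theta = rng.dirichlet(alpha)                             # document-level topic proportions
    words = []
    for _ in range(n_words):
        z = rng.choice(len(alpha), p=theta)                  # topic assignment for this word
        words.append(rng.choice(beta.shape[1], p=beta[z]))   # word drawn from topic z
    return theta, words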

Slide 15

Characteristics of LDA

Slide 16

3. COLLABORATIVE TOPIC REGRESSION

Slide 17

COLLABORATIVE TOPIC REGRESSION

Slide 18

Generative process of CTR
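For reference, CTR's generative process from the paper, sketched in code (names are illustrative; conf stands for the confidences c_ij and may be a scalar or an (I, J) array): each item vector is the LDA topic proportion plus a Gaussian offset, v_j = theta_j + eps_j, and ratings are drawn as r_ij ~ N(u_i^T v_j, 1/c_ij).

import numpy as np

def generate_ctr_ratings(theta, n_users, lam_u, lam_v, conf, rng=None):
    # theta: (J, K) topic proportions from LDA for each article.
    if rng is None:
        rng = np.random.default_rng()
    J, K = theta.shape
    U = rng.normal(0.0, 1.0 / np.sqrt(lam_u), size=(n_users, K))  # u_i ~ N(0, I/lambda_u)
    eps = rng.normal(0.0, 1.0 / np.sqrt(lam_v), size=(J, K))      # offset eps_j ~ N(0, I/lambda_v)
    V = theta + eps                                               # v_j = theta_j + eps_j
    R = rng.normal(U @ V.T, 1.0 / np.sqrt(conf))                  # r_ij ~ N(u_i^T v_j, 1/c_ij)
    return U, V, R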

Slide 19

3. COLLABORATIVE TOPIC REGRESSION

Slide 20

Why the CTR model qualifies as a "regression"
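A hedged reading of this slide title, based on the paper: conditioning on the topic proportions, the expected rating is linear in theta_j, namely E[r_ij | u_i, theta_j] = u_i^T E[theta_j + eps_j | theta_j] = u_i^T theta_j, so each user's latent vector u_i acts as that user's regression coefficients on the article's topics, hence collaborative topic "regression".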

Slide 21

How the model is trained

Slide 22

How the model is trained

Slide 23

A simple proof (handwritten on an iPad)

Slide 24

How the model is trained
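The training slides themselves were not captured here; as a sketch of the coordinate-ascent updates described in the paper (the theta_j update by projection gradient and the topic-matrix update are omitted), the closed-form MAP update for each item vector pulls the plain MF solution toward the article's topic proportions:

import numpy as np

def update_item_vectors(R, C, U, theta, lam_v):
    # v_j = (U^T C_j U + lambda_v I)^{-1} (U^T C_j R_j + lambda_v * theta_j):
    # identical to the MF item update except that the prior mean is theta_j, not 0.
    # R, C: (I, J) ratings/confidences, U: (I, K) user vectors, theta: (J, K).
    I, J = R.shape
    K = U.shape[1]
    V = np.zeros((J, K))
    for j in range(J):
        Cj = C[:, j]
        A = (U * Cj[:, None]).T @ U + lam_v * np.eye(K)   # U^T diag(C_j) U + lambda_v I
        b = U.T @ (Cj * R[:, j]) + lam_v * theta[j]       # U^T diag(C_j) R_j + lambda_v theta_j
        V[j] = np.linalg.solve(A, b)
    return V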

Slide 25

Prediction
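Prediction in the paper comes in two flavors, which the sketch below illustrates (function names are mine): for articles seen at training time the full v_j = theta_j + eps_j is available, while a brand-new article only has its topic proportions, so the CF offset drops out and the recommendation is purely content-driven.

import numpy as np

def predict_in_matrix(U, V):
    # r_hat_ij = u_i^T v_j = u_i^T (theta_j + eps_j)
    return U @ V.T

def predict_out_of_matrix(U, theta_new):
    # New article with no ratings yet: r_hat_ij = u_i^T theta_j (content only)
    return U @ theta_new.T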

Slide 26

4. EMPIRICAL STUDY

Slide 27

Scale of the dataset

Slide 28

Evaluation
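The evaluation metric in the empirical study is recall@M over each user's held-out library; a small sketch (illustrative, not the authors' code):

import numpy as np

def recall_at_m(scores, held_out_articles, m):
    # scores: (J,) predicted ratings for one user over all candidate articles;
    # held_out_articles: indices of the articles in that user's held-out library.
    top_m = set(np.argsort(-scores)[:m].tolist())    # the M highest-scoring articles
    hits = len(top_m & set(held_out_articles))
    return hits / len(held_out_articles) if len(held_out_articles) else 0.0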

Slide 29

Results

Slide 30

Results: dependence on the number of articles in a user's library (Fig 5) and on the number of users who liked an article (Fig 6). As the number of articles in a library grows, recall drops (the system ends up surfacing less well-known articles). As the number of likes an article has grows, recall rises (CF works well for articles everyone is reading).

Slide 31

Results: topics preferred by two example users, extracted by ranking the weights of the latent topic vector.

Slide 32

Results: top 10 articles with the largest offsets. (Corresponds to cases where CF matters more than the content.)

Slide 33

Results: example where the EM paper is also widely referenced by the Bayesian statistics crowd. (Corresponds to cases where CF matters more than the content.)

Slide 34

Results: conversely, an example where the topics do not broaden. (Corresponds to cases where the content is dominant.)

Slide 35

5. CONCLUSIONS AND FUTURE WORK