Collaborative Topic Modeling for Recommending Scientific Articles
Shinichi Takayanagi
May 30, 2016
Research (0 stars, 1.5k views)
Slides used when reading the paper "Collaborative Topic Modeling for Recommending Scientific Articles" (Chong Wang and David M. Blei, KDD 2011).
More Decks by Shinichi Takayanagi
論文紹介「Evaluation gaps in machine learning practice」と、効果検証入門に関する昔話 (stakaya, 0 stars, 970 views)
バイブコーディングの正体——AIエージェントはソフトウェア開発を変えるか? (stakaya, 5 stars, 1.4k views)
[NeurIPS 2023 論文読み会] Wasserstein Quantum Monte Carlo (stakaya, 0 stars, 560 views)
[KDD2021 論文読み会] ControlBurn: Feature Selection by Sparse Forests (stakaya, 2 stars, 2k views)
[ICML2021 論文読み会] Mandoline: Model Evaluation under Distribution Shift (stakaya, 0 stars, 2k views)
[情報検索/推薦 各社合同 論文読み祭 #1] KDD ‘20 "Embedding-based Retrieval in Facebook Search" (stakaya, 2 stars, 640 views)
【2020年新人研修資料】ナウでヤングなPython開発入門 (stakaya, 29 stars, 21k views)
論文読んだ「Simple and Deterministic Matrix Sketching」 (stakaya, 1 star, 1.2k views)
Quick Introduction to Approximate Bayesian Computation (ABC) with R (stakaya, 3 stars, 360 views)
Other Decks in Research
大学見本市2025 JSTさきがけ事業セミナー「顔の見えないセンシング技術：多様なセンサにもとづく個人情報に配慮した人物状態推定」 (miso2024, 0 stars, 190 views)
説明可能な機械学習と数理最適化 (kelicht, 2 stars, 760 views)
POI: Proof of Identity (katsyoshi, 0 stars, 120 views)
大規模言語モデルにおけるData-Centric AIと合成データの活用 / Data-Centric AI and Synthetic Data in Large Language Models (tsurubee, 1 star, 460 views)
Sat2City: 3D City Generation from A Single Satellite Image with Cascaded Latent Diffusion (satai, 4 stars, 390 views)
AIスパコン「さくらONE」のLLM学習ベンチマークによる性能評価 / SAKURAONE LLM Training Benchmarking (yuukit, 2 stars, 910 views)
AIグラフィックデザインの進化：断片から統合（One Piece）へ / From Fragment to One Piece: A Survey on AI-Driven Graphic Design (shunk031, 0 stars, 580 views)
学習型データ構造：機械学習を内包する新しいデータ構造の設計と解析 (matsui_528, 5 stars, 2.2k views)
超高速データサイエンス (matsui_528, 1 star, 320 views)
さまざまなAgent FrameworkとAIエージェントの評価 (ymd65536, 1 star, 370 views)
[IBIS 2025] 深層基盤モデルのための強化学習驚きから理論にもとづく納得へ (akifumi_wachi, 19 stars, 9k views)
教師あり学習と強化学習で作る 最強の数学特化LLM (analokmaus, 2 stars, 770 views)
Transcript
RCO Paper Reading Group (2016/05/27): "Collaborative topic modeling for recommending scientific articles" (KDD 2011), Chong Wang and David M. Blei. Presenter: Shinichi Takayanagi (高柳慎一).
Slide 1: ABSTRACT
Slide 2: 1. INTRODUCTION
Slide 3: 1. INTRODUCTION
Slide 4: 1. INTRODUCTION
Slide 5: 2. BACKGROUND & 2.1 Recommendation Tasks
Slide 6: 2.1 Recommendation Tasks
Slide 7: 2.1 Recommendation Tasks
Slide 8: 2.2 Recommendation by Matrix Factorization
Slide 9: 2.2 Recommendation by Matrix Factorization
Slide 10: 2.2 Recommendation by Matrix Factorization
Slide 11: 2.2 Recommendation by Matrix Factorization
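Slides 8-11 review recommendation by matrix factorization. As a reference for readers without the slides, the probabilistic matrix factorization setup that the paper builds on (reconstructed from the paper, not transcribed from the slides) places Gaussian priors on the user and item latent vectors and models each binary rating with a per-pair confidence:

\[ u_i \sim \mathcal{N}(0, \lambda_u^{-1} I_K), \qquad v_j \sim \mathcal{N}(0, \lambda_v^{-1} I_K), \qquad r_{ij} \sim \mathcal{N}(u_i^{\top} v_j, \, c_{ij}^{-1}), \]
\[ c_{ij} = a \ \text{if } r_{ij} = 1, \qquad c_{ij} = b \ \text{if } r_{ij} = 0, \qquad a > b > 0. \]

The asymmetric confidence reflects that r_{ij} = 0 can mean either "not interested" or simply "never saw the article", so zero entries are trusted less than ones.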
Slide 12: 2.3 Probabilistic Topic Models
Slide 13: Generative process of LDA
Slide 14: Characteristics of LDA
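Slides 12-14 recap probabilistic topic models (LDA). For orientation, the generative process the deck refers to is the standard one, written here in the paper's notation (θ_j: topic proportions of article j; β_{1:K}: the K topics over the vocabulary):

\[ \theta_j \sim \mathrm{Dirichlet}(\alpha), \qquad z_{jn} \mid \theta_j \sim \mathrm{Mult}(\theta_j), \qquad w_{jn} \mid z_{jn}, \beta_{1:K} \sim \mathrm{Mult}(\beta_{z_{jn}}). \]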
Slide 15: 3. COLLABORATIVE TOPIC REGRESSION
Slide 16: COLLABORATIVE TOPIC REGRESSION
Slide 17: Generative process of CTR
Slide 18: 3. COLLABORATIVE TOPIC REGRESSION
Slide 19: Why the CTR model is called a "regression"
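Slides 15-19 introduce Collaborative Topic Regression (CTR). Reconstructed from the paper, its generative process couples LDA and matrix factorization through the item vector: an article's latent vector is its topic proportions plus a Gaussian offset.

1. For each user i, draw u_i ~ N(0, λ_u^{-1} I_K).
2. For each article j, draw θ_j ~ Dirichlet(α), draw the offset ε_j ~ N(0, λ_v^{-1} I_K), and set v_j = θ_j + ε_j; the words of the article are then generated as in LDA.
3. For each user-article pair (i, j), draw r_{ij} ~ N(u_i^{\top} v_j, c_{ij}^{-1}).

The "regression" in the name (slide 19) comes from the conditional expectation E[r_{ij} | u_i, θ_j, ε_j] = u_i^{\top}(θ_j + ε_j): the expected rating is linear in the article's topic proportions, with the offset ε_j capturing whatever the collaborative signal adds beyond content.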
Slide 20: How the model is trained
Slide 21: How the model is trained
Slide 22: A simple proof, handwritten on an iPad
Slide 23: How the model is trained
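Slides 20-23 cover training. In the paper this is MAP estimation by coordinate ascent: the user and item vectors have closed-form updates (presumably what the handwritten proof on slide 22 derives), while θ_j is updated by projection gradient or simply fixed at its LDA estimate. Reconstructed from the paper, the closed-form updates are:

\[ u_i \leftarrow (V C_i V^{\top} + \lambda_u I_K)^{-1} V C_i R_i, \qquad v_j \leftarrow (U C_j U^{\top} + \lambda_v I_K)^{-1} (U C_j R_j + \lambda_v \theta_j), \]

where U and V stack the user and item vectors, C_i is the diagonal matrix of confidences c_{ij}, and R_i = (r_{ij})_{j=1}^{J} (and symmetrically for C_j, R_j).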
Slide 24: Prediction
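Slide 24 is about prediction. The paper distinguishes in-matrix prediction (articles that already have some ratings) from out-of-matrix prediction (brand-new articles, where only the content, and hence θ_j, is available):

\[ \hat{r}_{ij} \approx (u_i^{*})^{\top} v_j^{*} \quad \text{(in-matrix)}, \qquad \hat{r}_{ij} \approx (u_i^{*})^{\top} \theta_j^{*} \quad \text{(out-of-matrix)}. \]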
Slide 25: 4. EMPIRICAL STUDY
Slide 26: Scale of the data
Slide 27: Evaluation
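Slide 27 describes the evaluation. Because a zero entry can mean "not seen" rather than "disliked", precision is hard to interpret, so the paper reports per-user recall at a cutoff M:

\[ \mathrm{recall}@M = \frac{\text{number of liked articles in the top } M}{\text{total number of articles the user likes}}. \]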
Slide 28: Results
Slide 29: Results (dependence on the number of articles in a user's library, Fig 5, and on the number of users who liked an article, Fig 6). As library size grows, recall drops (the system has to surface less well-known articles); as the number of likes grows, recall rises (CF works well for articles everyone is reading).
Slide 30: Results (topics preferred by two example users), extracted by ranking the weights of the topic latent vectors.
Slide 31: Results (top 10 articles with the largest offsets); these correspond to cases where CF matters more than content.
Slide 32: Results (an example where the EM paper is also heavily referenced by the Bayesian statistics community); again a case where CF matters more than content.
Slide 33: Results (conversely, an example where the topics do not broaden); a case where content dominates.
Slide 34: 5. CONCLUSIONS AND FUTURE WORK