Collaborative Topic Modeling for Recommending Scientific Articles
Shinichi Takayanagi
May 30, 2016

Slides used for a reading of the paper "Collaborative Topic Modeling for Recommending Scientific Articles".
Transcript
RCO Paper Reading Group (2016/05/27): "Collaborative Topic Modeling for Recommending Scientific Articles" (KDD 2011), by Chong Wang and David M. Blei. Presenter: Shinichi Takayanagi.
(C) Recruit Communications Co., Ltd.

Slide 1: Abstract
Slides 2-4: 1. Introduction
Slides 5-7: 2. Background / 2.1 Recommendation Tasks
Slides 8-11: 2.2 Recommendation by Matrix Factorization
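As a rough illustration of the matrix-factorization recommendation these slides cover (a minimal sketch of the general technique, not code from the presentation; all names and sizes are hypothetical): each user i and item j gets a k-dimensional latent vector, and the predicted preference is their inner product.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, k = 4, 5, 2  # toy sizes

# Latent factors: one k-dimensional vector per user and per item.
U = rng.normal(size=(n_users, k))  # user factors u_i
V = rng.normal(size=(n_items, k))  # item factors v_j

# Predicted preference matrix: R_hat[i, j] = u_i . v_j
R_hat = U @ V.T
print(R_hat.shape)  # (4, 5)
```

In the paper's setting the ratings are binary (a user has or has not saved an article in a library), and the factors are fit by minimizing a confidence-weighted squared error.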
Slide 12: 2.3 Probabilistic Topic Models
Slide 13: The generative process of LDA
Slide 14: Characteristics of LDA
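The LDA generative process on slide 13 can be sketched as follows (my own illustration with hypothetical toy sizes, not the slide's content): draw per-document topic proportions from a Dirichlet, then for each word draw a topic from those proportions and a word from that topic's word distribution.

```python
import numpy as np

rng = np.random.default_rng(1)
K, V_size, doc_len = 3, 6, 10  # topics, vocabulary size, words per document

alpha = np.full(K, 0.5)                         # Dirichlet prior on topic proportions
beta = rng.dirichlet(np.ones(V_size), size=K)   # per-topic word distributions

theta = rng.dirichlet(alpha)  # topic proportions theta_j for one document
doc = []
for _ in range(doc_len):
    z = rng.choice(K, p=theta)         # draw a topic assignment for this word
    w = rng.choice(V_size, p=beta[z])  # draw a word from that topic
    doc.append(int(w))
print(doc)
```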
Slides 15-16: 3. Collaborative Topic Regression
Slide 17: The generative process of CTR
Slide 18: 3. Collaborative Topic Regression
Slide 19: Why the CTR model is called a "regression"
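The key step in CTR's generative process is that an item's latent vector is its LDA topic proportions plus a Gaussian offset, v_j = theta_j + eps_j, and the preference is again an inner product with the user vector. A hedged sketch (the lambda_v value and sizes are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
k = 3
lambda_v = 100.0  # precision of the item offset (hypothetical value)

theta_j = rng.dirichlet(np.ones(k))                 # topic proportions from LDA
eps_j = rng.normal(scale=lambda_v ** -0.5, size=k)  # per-item offset epsilon_j
v_j = theta_j + eps_j                               # CTR item vector

u_i = rng.normal(size=k)  # user latent vector
r_hat = float(u_i @ v_j)  # predicted preference u_i . v_j
print(r_hat)
```

The offset eps_j is what lets collaborative-filtering signal pull an item away from what its text alone suggests; when lambda_v is large the model falls back toward pure content.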
Slides 20-21: How the model is trained
Slide 22: A short proof (handwritten on an iPad)
Slide 23: How the model is trained
Slide 24: Prediction
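Training alternates closed-form coordinate-ascent updates for the user and item vectors, each a confidence-weighted ridge-regression solve; the item update is pulled toward its topic proportions theta_j. A sketch under my reading of the paper (hyperparameter values and sizes are hypothetical, and this omits the topic-proportion update):

```python
import numpy as np

rng = np.random.default_rng(3)
n_users, n_items, k = 5, 6, 3
lambda_u, lambda_v = 0.01, 100.0

R = (rng.random((n_users, n_items)) < 0.3).astype(float)  # binary ratings
C = np.where(R > 0, 1.0, 0.01)  # confidence: high for observed, low otherwise
Theta = rng.dirichlet(np.ones(k), size=n_items)  # LDA topic proportions per item

U = rng.normal(scale=0.1, size=(n_users, k))  # user vectors
V = Theta.copy()                              # item vectors, initialized at theta_j

# One coordinate-ascent sweep (sketch of the closed-form updates).
for i in range(n_users):
    Ci = np.diag(C[i])
    A = V.T @ Ci @ V + lambda_u * np.eye(k)
    U[i] = np.linalg.solve(A, V.T @ Ci @ R[i])
for j in range(n_items):
    Cj = np.diag(C[:, j])
    A = U.T @ Cj @ U + lambda_v * np.eye(k)
    V[j] = np.linalg.solve(A, U.T @ Cj @ R[:, j] + lambda_v * Theta[j])
```

For prediction (slide 24), an in-matrix score is u_i . v_j; for a brand-new article with no ratings the offset is unknown, so the score degrades to u_i . theta_j, using the text alone.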
Slide 25: 4. Empirical Study
Slide 26: Scale of the data
Slide 27: Evaluation
Slide 28: Results
Slide 29: Results as a function of the number of articles in a user's library (Fig. 5) and of the number of users who liked an article (Fig. 6): recall drops as library size grows (the system surfaces less well-known articles), and rises with the like count (CF works well for articles everyone reads).
Slide 30: Results (topics preferred by two example users), extracted by ranking the weights of the latent topic vectors.
Slide 31: Results (top 10 articles with the largest offsets); these are cases where CF matters more than content.
Slide 32: Results (example: the EM paper is heavily cited even by Bayesian statisticians); another case where CF matters more than content.
Slide 33: Results (a contrasting example where the topics do not spread); a case where content dominates.
Slide 34: 5. Conclusions and Future Work