Collaborative Topic Modeling for Recommending Scientific Articles
Slides used for a reading of the paper "Collaborative Topic Modeling for Recommending Scientific Articles".
Shinichi Takayanagi
May 30, 2016
Transcript
RCO Paper Reading Group (2016/05/27): "Collaborative Topic Modeling for Recommending Scientific Articles" (KDD 2011), Chong Wang and David M. Blei. Presenter: Shinichi Takayanagi
ABSTRACT
1. INTRODUCTION
2. BACKGROUND & 2.1 Recommendation Tasks
2.1 Recommendation Tasks
2.2 Recommendation by Matrix Factorization
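For readers without the figures, the matrix-factorization background that the paper builds on can be sketched roughly as follows (a paraphrase of Section 2.2 of the paper; K is the latent dimension and r_{ij} the binary rating of user i on article j):

u_i \sim \mathcal{N}(0, \lambda_u^{-1} I_K), \qquad v_j \sim \mathcal{N}(0, \lambda_v^{-1} I_K)
r_{ij} \sim \mathcal{N}(u_i^\top v_j,\; c_{ij}^{-1}), \qquad c_{ij} = a \ \text{if}\ r_{ij} = 1,\quad c_{ij} = b \ \text{otherwise}\ (a > b > 0)

The confidence c_{ij} downweights the zero entries, since an article a user has not added to their library may simply be one they have never seen.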
2.3 Probabilistic Topic Models
The generative process of LDA
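For reference, the standard LDA generative process for a document w_j given K topics \beta_{1:K}, which this slide presumably walks through, is:

\theta_j \sim \mathrm{Dirichlet}(\alpha)
\text{for each word } n:\quad z_{jn} \sim \mathrm{Mult}(\theta_j), \qquad w_{jn} \sim \mathrm{Mult}(\beta_{z_{jn}})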
Characteristics of LDA
3. COLLABORATIVE TOPIC REGRESSION
The generative process of CTR
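A compact restatement of the CTR generative process from the paper; the per-item latent offset \epsilon_j is the piece that ties LDA and matrix factorization together:

\text{for each user } i:\quad u_i \sim \mathcal{N}(0, \lambda_u^{-1} I_K)
\text{for each item } j:\quad \theta_j \sim \mathrm{Dirichlet}(\alpha),\quad \epsilon_j \sim \mathcal{N}(0, \lambda_v^{-1} I_K),\quad v_j = \theta_j + \epsilon_j
\text{(the words of item } j \text{ are drawn from } \theta_j \text{ as in LDA)}
\text{for each pair } (i, j):\quad r_{ij} \sim \mathcal{N}(u_i^\top v_j,\; c_{ij}^{-1})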
3. COLLABORATIVE TOPIC REGRESSION
Why the CTR model qualifies as a regression
How the model is trained
A short proof, handwritten on an iPad
How the model is trained
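As context for the training slides: learning is MAP estimation by coordinate ascent, alternating closed-form updates of u_i and v_j with a projection-gradient step on \theta_j. In my transcription, the paper's updates are:

u_i \leftarrow (V C_i V^\top + \lambda_u I_K)^{-1} V C_i R_i
v_j \leftarrow (U C_j U^\top + \lambda_v I_K)^{-1} (U C_j R_j + \lambda_v \theta_j)

where C_i = \mathrm{diag}(c_{i1}, \dots, c_{iJ}) and R_i = (r_{ij})_{j=1}^{J} for user i, and analogously for item j. Note that letting \lambda_v \to \infty pins v_j to \theta_j, recovering a content-only model.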
Prediction
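Prediction as described in the paper: for in-matrix prediction the learned offset is available, while for out-of-matrix (cold-start) articles only the topic proportions of the new document can be used:

\hat{r}_{ij} \approx u_i^\top v_j = u_i^\top (\theta_j + \epsilon_j) \quad \text{(in-matrix)}
\hat{r}_{ij} \approx u_i^\top \theta_j \quad \text{(out-of-matrix)}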
4. EMPIRICAL STUDY
Scale of the dataset
Evaluation
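The evaluation metric in the paper is recall@M, computed per user and then averaged over users (precision is hard to interpret here, since a zero rating may only mean the user has not seen the article):

\mathrm{recall@M} = \frac{\text{number of articles in the user's library that appear in the top } M \text{ recommendations}}{\text{total number of articles in the user's library}}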
Results
Results (dependence on the number of articles in a user's library, Fig 5, and on the number of users who liked an article, Fig 6)
As the number of articles in a user's library grows, recall drops (the model ends up recommending less well-known papers); as the number of users who liked an article grows, recall rises (CF works well for papers that many users have read).
Results (extracting the topics preferred by two example users): the topics are extracted by ranking the weights of the latent topic vectors.
Results (the 10 articles with the largest offsets): these correspond to cases where CF matters more than the content.
Results (an example where the EM paper is also widely referenced by the Bayesian statistics community): again a case where CF matters more than the content.
Results (conversely, an example where the topics do not spread): a case where the content is dominant.
5. CONCLUSIONS AND FUTURE WORK