Incremental Factorization Machines

Takuya Kitazawa
September 15, 2016

Presented at Workshop on Profiling User Preferences for Dynamic Online and Real-Time Recommendations (RecProfile in conjunction with RecSys 2016)

arXiv paper: https://arxiv.org/abs/1607.02858


Transcript

  1. Incremental Factorization Machines for Persistently Cold-starting Online
     Item Recommendation. Takuya Kitazawa, Graduate School of Information
     Science and Technology, The University of Tokyo. RecProfile 2016.
  2. Incremental Factorization Machines for (1) persistently cold-starting,
     (2) online item recommendation, proposed as a new baseline method. It
     combines Factorization Machines (a context-aware recommender) with
     incremental MF (an online item recommender optimized by SGD).
  3. Challenge I: Persistent cold-start in e-commerce. Continuously
     recommending "new" items to "unforeseen" users leads to poor
     recommendation accuracy. Causes: users' rare activity, changing
     personas, short-lived items, changing prices. Industrial reports:
     Yahoo! Labs, online ads [M. Aharon et al. RecSys 2013]; Booking.com,
     hotel reservation [L. Bernardi et al. CBRecSys 2015]; Rakuten, golf
     package reservation [R. Swezey and Y. Chung. CIKM 2015].
  4. Promising solution: context-aware (feature-based) recommendation.
     Inputs are enriched with user features (Occupation, Browser, Age,
     Sex), e.g. [ 0, 0, 1, 0, 0, 0, 0, 1, 22, 1 ], and item features
     (Category, Price), e.g. [ 0, 1, 1, 0, 500 ].
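The enriched input on this slide can be assembled with simple one-hot encoding. A minimal sketch; the field sizes and index choices below are assumptions picked only to reproduce the slide's example vectors:

```python
def one_hot(idx, size):
    """One-hot encode a categorical index into a list of 0/1 floats."""
    v = [0.0] * size
    v[idx] = 1.0
    return v

# Hypothetical layout: user = Occupation (one-hot, 5) + Browser (one-hot, 3)
#                           + Age (numeric) + Sex (0/1)
user_x = one_hot(2, 5) + one_hot(2, 3) + [22.0, 1.0]
# Hypothetical layout: item = Category (multi-hot, 4) + Price (numeric)
item_x = [0.0, 1.0, 1.0, 0.0] + [500.0]
print(user_x)  # [0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 22.0, 1.0]
print(item_x)  # [0.0, 1.0, 1.0, 0.0, 500.0]
```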
  5. Factorization Machines (FMs; a general predictor) for context-aware
     recommendation: the model is a global bias plus linear terms plus
     pairwise interaction terms, and it subsumes MF as a special case
     [S. Rendle. ACM TIST, 3(3), 2012] (Fig. 1).
  6. FMs for context-aware recommendation (continued). Persistent
     cold-start can be seen as concept drift, so online algorithms are
     more effective.
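The FM model on slides 5-6 (global bias + linear + pairwise interaction terms) can be written as a short prediction function. This is a minimal sketch of the standard second-order FM equation, not code from the paper:

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order FM prediction: global bias + linear terms + pairwise
    interactions. x: (d,) input, w0: scalar bias, w: (d,) linear weights,
    V: (d, k) factor matrix. The pairwise term sum_{i<j} <v_i, v_j> x_i x_j
    is computed via Rendle's O(d*k) reformulation:
    0.5 * sum_f [ (sum_i v_{i,f} x_i)^2 - sum_i v_{i,f}^2 x_i^2 ]."""
    s = V.T @ x
    s2 = (V ** 2).T @ (x ** 2)
    return float(w0 + w @ x + 0.5 * np.sum(s ** 2 - s2))
```

With `V` all zeros the pairwise term vanishes and the FM reduces to a plain linear model, which is one way to sanity-check an implementation.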
  7. Challenge II: Online item recommendation. Incrementally update the
     model based on user-item interactions (e.g. click, buy, book; not
     explicit ratings).
  8. Classical static baseline: Matrix Factorization (MF) for item
     recommendation (https://en.wikipedia.org/wiki/Collaborative_filtering).
     Approximate the binary interaction matrix as R ≈ P Q^T (users ×
     items), with a least-squares loss against the target 1.0 for observed
     (positive-only) entries.
  9. Incremental Matrix Factorization (iMF) for online item recommendation
     [J. Vinagre et al. UMAP 2014]: treating S as a data stream, each
     observation triggers one SGD step that updates the user factor (by
     user ID) and the item factor (by item ID). But where are the
     auxiliary features? iMF uses IDs only.
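The per-observation iMF step described here can be sketched as follows, assuming a squared loss against the constant positive target 1.0; the function and parameter names are mine, not from the paper:

```python
import numpy as np

def imf_update(P, Q, u, i, lr=0.1, reg=0.01, y=1.0):
    """One incremental MF step for an observed (user, item) event.
    P: (n_users, k) user factors, Q: (n_items, k) item factors.
    Squared loss against the constant positive target y (= 1.0);
    both factors are updated from their pre-step values."""
    p, q = P[u].copy(), Q[i].copy()
    err = y - p @ q
    P[u] += lr * (err * q - reg * p)
    Q[i] += lr * (err * p - reg * q)
    return err
```

Repeatedly feeding the same (user, item) event drives the predicted score toward 1.0, which is the intended behavior for positive-only feedback.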
  10. Overview: combining FMs (context-aware recommender; the modern
      context-aware baseline) with iMF (online item recommender; the
      classical online baseline) yields Incremental FMs (iFMs), a
      context-aware online item recommender optimized by SGD, proposed as
      a generic framework (i.e. a baseline).
  11. Using FMs for item recommendation: solve MF with a unique target
      value [J. Vinagre et al. In Proc. of UMAP 2014, pp. 459-470], i.e.
      solve the FM as regression (least-squares loss) with y = 1.0 for
      every observed interaction.
  12. Generalizing iMF to incremental FMs (iFMs): with S as a data stream,
      for each observation replace MF's SGD updates of the user and item
      vectors with the FM SGD update of every parameter touched by the
      sample. One update costs O(#non-zero in x × k).
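The generalized SGD step can be sketched for a sparse sample so that only non-zero features are touched, giving the O(#non-zero in x × k) cost mentioned on the slide. The signature and hyperparameter defaults are assumptions, not the paper's code:

```python
import numpy as np

def ifm_sgd_update(idx, val, w0, w, V, lr=0.05, reg=0.01, y=1.0):
    """One iFM SGD step for a sparse sample given as parallel lists of
    feature indices and values, with squared loss against target y.
    Only non-zero features are touched: O(nnz(x) * k) per step."""
    s = np.zeros(V.shape[1])
    sq = np.zeros(V.shape[1])
    for j, v in zip(idx, val):
        s += V[j] * v                 # sum_j v_{j,f} x_j, reused below
        sq += (V[j] * v) ** 2
    pred = w0 + sum(w[j] * v for j, v in zip(idx, val)) \
              + 0.5 * float(np.sum(s ** 2 - sq))
    g = pred - y                      # dL/d(pred) for squared loss
    w0 -= lr * g
    for j, v in zip(idx, val):
        # d(pred)/d v_{j,f} = x_j * (s_f - v_{j,f} x_j); s is held fixed
        # within one step, as in libFM-style implementations.
        V[j] -= lr * (g * v * (s - V[j] * v) + reg * V[j])
        w[j] -= lr * (g * v + reg * w[j])
    return w0, pred
```

Since `w0` is a scalar, the updated value is returned rather than mutated in place; `w` and `V` are updated in place.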
  13. Incremental adaptive regularization. FMs have many hyperparameters;
      adaptive regularization [S. Rendle. In Proc. of WSDM 2012,
      pp. 133-142] learns the regularization strengths against validation
      samples, but in a stream any held-out validation sample becomes
      outdated immediately, hence the "pseudo validation sample" idea.
  14. Evaluation of online item recommenders: test-then-learn procedure
      [J. Vinagre et al. In Proc. of REDD 2014, "Evaluation of Recommender
      Systems in Streaming Environments"]. Given timestamped samples
      (u1, i1), ..., (un, in) for evaluation: 0) pre-train with the
      initial 20% of samples; then, for each sample (u, i):
      1) make a top-N recommendation for u; 2) check whether i is in the
      top-N list; 3) compute recall@N; 4) update the model based on (u, i).
  15. Experiment: MovieLens 100k (1/2). A real-world movie rating dataset.
      ▪ Binarized: { 1, 2, 3, 4, 5 } => { 0, 1 } ▪ Exhibits users' rare
      activity.
  16. Experiment: MovieLens 100k (2/2). Accuracy: MF < FMs, and
      static < incremental.
  17. Experiment: Synthetic click (1/2). ▪ Rule-based click generator from
      [M. Aharon et al. RecSys 2013] ▪ 5 ad variants with categories
      { ad1, ad2 }, { ad3, ad4 }, { ad5 } ▪ Item instability: 500k
      impressions ▶ rule change ▶ 500k impressions.
  18. Experiment: Synthetic click (2/2). iFMs show higher adaptivity to
      the changes.
  19. Overall results (5 runs with different initial parameters).
      Trade-off between efficiency and context-awareness: FMs are slower
      but more accurate.
  20. Summary ▪ Key features: positive-only feedback; incremental adaptive
      regularization ▪ Use the appropriate competitor:

                 ID-only   Feature-based
        Static   MF        FMs
        Online   iMF       iFMs

      FMs: context-aware recommender; iMF: online item recommender;
      Incremental FMs (iFMs): context-aware online item recommender.
  21. Future work ▶ More competitors/datasets (timestamped, rich in
      contexts) ▶ Other approaches for positive-only feedback (e.g. BPR)
      [J. Vinagre et al. In Proc. of UMAP 2014] (Fig. 1).
  22. Incremental Factorization Machines for Persistently Cold-starting
      Online Item Recommendation. Takuya Kitazawa, Graduate School of
      Information Science and Technology, The University of Tokyo.
      RecProfile 2016.
  23. (Appendix) Synthetic click rules: 500k impressions ▶ rule change ▶
      500k impressions. The "most clicked ad" rules follow
      [M. Aharon et al. RecSys 2013] (Table 1).
  24. (Appendix) New feature insertion: e.g. insert a new user's dimension
      into the model.
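Inserting a new feature dimension amounts to growing the FM's parameter arrays when an unseen dimension (e.g. a new user ID) arrives. A minimal sketch; the zero/small-random initialization scale is an assumption:

```python
import numpy as np

def insert_feature(w, V, init_scale=0.01, rng=None):
    """Grow an FM when a previously unseen feature dimension appears:
    append a zero linear weight and a small random factor row, so
    existing predictions are barely perturbed by the new dimension."""
    rng = rng or np.random.default_rng(0)
    w = np.append(w, 0.0)
    V = np.vstack([V, init_scale * rng.standard_normal((1, V.shape[1]))])
    return w, V
```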