
[読み会] TabNet: Attentive Interpretable Tabular Learning

mei28
January 05, 2021


Reading-group slides
TabNet: Attentive Interpretable Tabular Learning (ICLR 2020, rejected)


Transcript

  1.  Paper information
      • Authors: Sercan O. Arik, Tomas Pfister (Google Cloud AI)
      • Source: arXiv preprint
      • Rejected from ICLR 2020
  2.  Proposed method: key components
      • Attentive transformer: learns the masks applied to the features.
      • Feature transformer: transforms the features and decides what is passed to the next step.
  3.  Proposed method. Attentive transformer: learns the masks.
      • M[i] = sparsemax(P[i − 1] ⋅ h_i(a[i − 1]))
      • P[i]: a prior-scale weight that changes depending on how much each feature has been used by past masks M (in the implementation it acts like a usage budget).
      • Sparsemax: an activation function similar to softmax, but producing sparse (exactly zero) outputs.
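The sparsemax activation mentioned above can be computed in closed form. A minimal NumPy sketch (following the standard projection-onto-the-simplex formulation of Martins & Astudillo, not code from the TabNet paper):

```python
import numpy as np

def sparsemax(z):
    """Sparsemax: like softmax, its output sums to 1, but it can assign
    exact zeros to small logits -- which is why TabNet uses it for masks."""
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z)[::-1]
    cumsum = np.cumsum(z_sorted)
    k = np.arange(1, len(z) + 1)
    # Support: largest k with 1 + k * z_(k) > cumulative sum of top-k entries
    support = 1 + k * z_sorted > cumsum
    k_max = k[support][-1]
    tau = (cumsum[k_max - 1] - 1) / k_max
    return np.maximum(z - tau, 0.0)

mask = sparsemax([2.0, 1.0, -1.0])
# The mask sums to 1 like a softmax output, but the two smaller logits
# are zeroed out entirely.
```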
  4.  Proposed method. Feature transformer: transforms the input and decides what is used next.
      • [d[i], a[i]] = f_i(M[i] ⋅ f); a[i] is passed on to the next step.
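One decision step of the formula above can be sketched with toy shapes. Note the single random linear map `W` below is only a stand-in for the real feature transformer f_i, which in the paper is a stack of shared and step-specific GLU blocks:

```python
import numpy as np

rng = np.random.default_rng(0)
D, n_d, n_a = 4, 2, 2            # feature dim, decision dim, attention dim

f = rng.normal(size=D)           # input feature vector (toy data)
M = np.array([1.0, 0.0, 0.0, 0.0])   # mask produced by the attentive transformer

# Stand-in for f_i: one linear map (the paper uses GLU blocks instead)
W = rng.normal(size=(D, n_d + n_a))
out = (M * f) @ W                # f_i(M[i] * f)

d, a = out[:n_d], out[n_d:]      # d feeds the step's output; a feeds step i+1
```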
  5.  Proposed method: interpretability
      • Feature importance is computed using the masks.
      • To weight each step simply, the decision outputs d (rather than the masks themselves) are used → how important is each sample at each step?
      • η_b[i] = Σ_{c=1}^{N_d} ReLU(d_{b,c}[i])
      • M_{agg-b,j} = Σ_{i=1}^{N_steps} η_b[i] M_{b,j}[i] / (Σ_{j=1}^{D} Σ_{i=1}^{N_steps} η_b[i] M_{b,j}[i]) → aggregate feature importance
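The two formulas above can be implemented directly. A minimal NumPy sketch for a single sample b, with made-up toy values for d and M:

```python
import numpy as np

def aggregate_importance(d, M):
    """Aggregate feature importance for one sample b, per the slide's formulas.
    d: (N_steps, N_d) decision outputs d_b[i]
    M: (N_steps, D)   masks M_b[i]
    Returns a length-D vector that sums to 1."""
    eta = np.maximum(d, 0.0).sum(axis=1)       # eta_b[i] = sum_c ReLU(d_b,c[i])
    weighted = (eta[:, None] * M).sum(axis=0)  # sum_i eta_b[i] * M_b,j[i]
    return weighted / weighted.sum()           # normalize over features j

d = np.array([[1.0, 0.5],
              [0.2, -0.3]])                    # 2 steps, N_d = 2 (toy values)
M = np.array([[0.7, 0.3, 0.0],
              [0.0, 0.5, 0.5]])                # 2 steps, D = 3 features
imp = aggregate_importance(d, M)
```

Steps with larger positive decision outputs (eta) contribute more to the final per-feature importance.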
  6.  Experiments: setup
      • Baselines:
        • Gradient boosting: LightGBM, XGBoost, CatBoost
        • NN models
      • What is compared?
        • Accuracy on the test data
        • Model size
  7.  Bonus: LightGBM vs NN model vs TabNet
      • TabNet: Accuracy: 0.81, ROC-AUC: 0.78
      • Hyperparameters were left at their defaults; no tuning was done.
      https://github.com/mei28/playground_python/blob/main/notebooks/titanic.ipynb
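For reference, the two metrics reported above can be computed as follows. A minimal NumPy sketch with made-up predictions (these are not the notebook's actual outputs); the ROC-AUC uses the rank (Mann-Whitney) formulation:

```python
import numpy as np

def accuracy(y_true, y_prob, threshold=0.5):
    """Fraction of samples whose thresholded probability matches the label."""
    return float(((y_prob >= threshold).astype(int) == y_true).mean())

def roc_auc(y_true, y_prob):
    """ROC-AUC as the probability that a random positive sample
    scores higher than a random negative one (ties count as half)."""
    pos = y_prob[y_true == 1]
    neg = y_prob[y_true == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Toy labels and predicted probabilities (illustration only)
y_true = np.array([0, 0, 1, 1, 1])
y_prob = np.array([0.1, 0.6, 0.4, 0.8, 0.9])
acc = accuracy(y_true, y_prob)   # 0.6 on this toy data
auc = roc_auc(y_true, y_prob)    # 5/6 on this toy data
```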