Slide 1

The Cutting Edge of Natural Language Processing and Deep Learning
徳永拓之
JustTechTalk

Slide 2

The Cutting Edge of Natural Language Processing and Deep Learning
• I'll introduce a small part of it
徳永拓之 (@tkng)
JustTechTalk

Slide 3

About me: 徳永拓之
• Twitter ID: @tkng
• I work on natural language processing and image processing at SmartNews, Inc.

Slide 4

No content

Slide 5

What is natural language processing?
• The field that deals with natural languages (≠ programming languages)
  • Machine translation
  • Question answering
  • Document classification
  • Syntactic parsing and dependency parsing
  • Morphological analysis and word segmentation

Slide 6

An example of machine translation
• Google Translate's Word Lens feature
http://googletranslate.blogspot.jp/2015/01/hallo-hola-ola-to-new-more-powerful.html

Slide 7

An example of question answering
• IBM Watson
• Beat its human opponents on Jeopardy!
http://www.nytimes.com/2011/02/17/science/17jeopardy-watson.html

Slide 8

What is deep learning?
• ≒ neural networks
• The recent boom comes down to:
  • better computer performance
  • more training data
  • research progress on optimization methods and the like

Slide 9

The Cutting Edge of Natural Language Processing and Deep Learning

Slide 10

Show, Attend and Tell: Neural Image Caption Generation with Visual Attention (Xu+, 2015)
• Generates descriptive captions for images
http://kelvinxu.github.io/projects/capgen.html

Slide 11

What kind of method is Show, Attend and Tell?
• A combination of the following three:
  • Convolutional Neural Network
  • Long Short-Term Memory
  • Attention

Slide 12

Generating Images from Captions with Attention (Mansimov+, 2015)
• Generates an image from a caption
• If you squint, the result does look something like an airplane

Slide 13

Effective Approaches to Attention-based Neural Machine Translation (Luong+, 2015)
• Machine translation with deep learning
• Proposes a new technique called local attention
• Achieves state of the art (the best published accuracy) on several language pairs

Slide 14

Ask Me Anything: Dynamic Memory Networks for Natural Language Processing (Kumar+, 2015)
• Proposes a new model, the Dynamic Memory Network
• The model looks like several recurrent neural networks combined
• State of the art on question answering, part-of-speech tagging, coreference resolution, and sentiment analysis

Slide 15

The current state of deep learning in NLP
• In terms of accuracy, it has not opened up a big lead over other methods
  • unlike in image processing and speech recognition
• Machine translation and question answering can now be solved with simple methods
• Sentence generation has become possible

Slide 16

What happens next?
• Honestly, I don't really know...
• Research that combines text with images and video will probably keep growing

Slide 17

How to keep up with the cutting edge

Slide 18

I'll narrow it down to three topics
• The basics of neural networks
• Recurrent neural networks
  • especially the Gated Recurrent Unit
• Attention

Slide 19

A neural network is a function
• A neural network can be viewed as a certain kind of function
• Its inputs and outputs are vectors
• It is differentiable

Slide 20

Starting from a simple example
y = f(x) = Wx
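
To make this concrete, here is the single linear map above as a few lines of NumPy. This sketch is my addition rather than something from the slides, and the 3x4 shape is an arbitrary choice:

```python
import numpy as np

# The whole "network" is one linear map: y = f(x) = Wx.
# Shape (3, 4) is arbitrary: it maps 4-dimensional input vectors
# to 3-dimensional output vectors.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))

def f(x):
    return W @ x

x = np.array([1.0, 2.0, 3.0, 4.0])
y = f(x)
print(y.shape)  # (3,)
```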

Slide 21

Normalizing the output to the range 0 to 1
• y = softmax(f(x))
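
A minimal softmax to go with this slide (my addition; subtracting the max before exponentiating is the standard numerical-stability trick, not something the slide itself mentions):

```python
import numpy as np

def softmax(z):
    # Shift by the max for numerical stability; softmax is
    # invariant to adding a constant to every component.
    e = np.exp(z - np.max(z))
    return e / e.sum()

p = softmax(np.array([1.0, 2.0, 3.0]))
print(p)        # every component lies in (0, 1)
print(p.sum())  # 1.0
```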

Slide 22

Let's make it multi-layered
• y = softmax(g(f(x)))
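
Putting the last two slides together, a sketch of y = softmax(g(f(x))) with two stacked linear maps; the shapes are again arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(5, 4))  # f maps R^4 -> R^5
W2 = rng.normal(size=(3, 5))  # g maps R^5 -> R^3

def f(x):
    return W1 @ x

def g(h):
    return W2 @ h

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

x = np.array([1.0, 2.0, 3.0, 4.0])
y = softmax(g(f(x)))
# Note: two purely linear maps still compose into one linear map;
# real networks put a nonlinearity (tanh, ReLU, ...) between layers.
print(y, y.sum())
```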

Slide 23

Which part is the "layer"?

Slide 24

A bit of trivia
• Watch out for the word "layer"
  • It is ambiguous which part it refers to (you can tell if you read carefully, but...)
• In the major open-source frameworks, the majority usage refers to the function (Caffe, Torch, Chainer, TensorFlow)

Slide 25

Recurrent Neural Networks
• A general term for networks that take the elements of a sequence one at a time and update an internal state
• Very popular these days
http://colah.github.io/posts/2015-08-Understanding-LSTMs/
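
As a sketch of "take one element at a time and update the state", here is a vanilla RNN step and, since the outline singled it out, a Gated Recurrent Unit step in NumPy. The shapes and random weights are arbitrary, and the GRU follows one common convention of the gate equations (some papers swap the roles of z and 1 - z):

```python
import numpy as np

rng = np.random.default_rng(0)
D, H = 4, 8  # input and state dimensions (arbitrary toy sizes)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# --- Vanilla RNN step: h' = tanh(Wx + Uh) ---
W, U = rng.normal(size=(H, D)), rng.normal(size=(H, H))

def rnn_step(x, h):
    return np.tanh(W @ x + U @ h)

# --- GRU step: gates decide how much of the old state survives ---
Wz, Uz = rng.normal(size=(H, D)), rng.normal(size=(H, H))
Wr, Ur = rng.normal(size=(H, D)), rng.normal(size=(H, H))
Wh, Uh = rng.normal(size=(H, D)), rng.normal(size=(H, H))

def gru_step(x, h):
    z = sigmoid(Wz @ x + Uz @ h)             # update gate
    r = sigmoid(Wr @ x + Ur @ h)             # reset gate
    h_cand = np.tanh(Wh @ x + Uh @ (r * h))  # candidate state
    return (1 - z) * h + z * h_cand

# Consume a sequence element by element, updating the state.
xs = rng.normal(size=(5, D))  # a length-5 sequence of input vectors
h = np.zeros(H)
for x in xs:
    h = gru_step(x, h)
print(h.shape)  # (8,)
```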

Slide 26

Why are RNNs so popular?
• Variable-length data is hard to handle
• It has become clear that seq2seq models built on RNNs (also called encoder-decoder models) can handle variable-length data well

Slide 27

What is a seq2seq model?
• It encodes the variable-length input into a fixed-length vector, then decodes the output data (e.g. the translation) from that vector
• Research has been advancing in recent years on tasks where input and output lengths differ, such as machine translation and automatic summarization
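
A toy sketch of that encode-then-decode control flow. The vocabulary, token ids, and greedy argmax decoding are illustrative assumptions, and the weights are untrained, so the output is gibberish; the point is only where the fixed-length vector sits:

```python
import numpy as np

rng = np.random.default_rng(0)
V, D, H = 6, 4, 8  # vocab size, embedding dim, state dim (toy sizes)
EOS = 0            # id of the end-of-sequence token

E = rng.normal(size=(V, D))  # embedding table
W_enc, U_enc = rng.normal(size=(H, D)), rng.normal(size=(H, H))
W_dec, U_dec = rng.normal(size=(H, D)), rng.normal(size=(H, H))
W_out = rng.normal(size=(V, H))  # decoder state -> vocabulary scores

def encode(tokens):
    # Fold the whole variable-length input into ONE fixed-length vector.
    h = np.zeros(H)
    for t in tokens:
        h = np.tanh(W_enc @ E[t] + U_enc @ h)
    return h

def decode(h, max_len=10):
    # Start from EOS; feed each emitted token back in as the next input.
    out, t = [], EOS
    for _ in range(max_len):
        h = np.tanh(W_dec @ E[t] + U_dec @ h)
        t = int(np.argmax(W_out @ h))  # greedy choice of the next token
        if t == EOS:
            break
        out.append(t)
    return out

src = [3, 1, 4, 2, EOS]      # e.g. "This is a pen <EOS>" as token ids
print(decode(encode(src)))   # untrained weights, so the ids are random
```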

Slide 28

Translation with a seq2seq model
[diagram: an encoder RNN reads "This is a pen <EOS>"; a decoder RNN then emits "これ は ペン です <EOS>", with each emitted word fed back in as the next input]

Slide 29

Translation with a seq2seq model
[same diagram as the previous slide, with the encoder highlighted]
• "This is a pen" is being encoded into a fixed-length vector!

Slide 30

What if we use a seq2seq model for translation?
• It is known to work quite well
• But it has weaknesses like these:
  • weak on long sentences
  • proper nouns get swapped for one another
• Attention, explained next, addresses them

Slide 31

What is attention?
• A mechanism that lets the decoder refer back to a little of the encoder-side information
• "a little" = it looks only at a selected part
• There are global attention and local attention

Slide 32

Global attention
• The attention is a weighted sum of the candidate states
• Historically, this variant is slightly older
[diagram: while decoding "これ は", every encoder state for "This is a pen <EOS>" is consulted]
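
A sketch of that weighted sum in NumPy. Scoring each encoder state by its dot product with the current decoder state is one common choice (the "dot" score in Luong+, 2015); the shapes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
T, H = 5, 8                           # source length, state dimension
enc_states = rng.normal(size=(T, H))  # one state per source token
dec_state = rng.normal(size=H)        # current decoder state

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

scores = enc_states @ dec_state  # how well each source position matches
weights = softmax(scores)        # sums to 1: "where to look"
context = weights @ enc_states   # weighted sum over ALL encoder states
print(weights)                   # every position gets some weight
print(context.shape)             # (8,)
```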

Slide 33

Local attention
• Selects and uses a few of the encoder-side states
[diagram: while decoding "これ は", only some of the encoder states for "This is a pen <EOS>" are consulted]
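
A simplified sketch: the same weighted sum as on the previous slide, but restricted to a window of encoder states. The window center p is fixed here for brevity; in Luong+ (2015) it is predicted from the decoder state, and the weights are additionally shaped by a Gaussian centered on p:

```python
import numpy as np

rng = np.random.default_rng(0)
T, H, R = 5, 8, 1                   # source length, state dim, window radius
enc_states = rng.normal(size=(T, H))
dec_state = rng.normal(size=H)

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

p = 2                               # window center (fixed for this sketch)
lo, hi = max(0, p - R), min(T, p + R + 1)
window = enc_states[lo:hi]          # only the selected encoder states
weights = softmax(window @ dec_state)
context = weights @ window          # attention over the window alone
print(context.shape)                # (8,)
```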

Slide 34

The order of attention
• Where to attend, in what order, is a hard problem for both global and local attention
• The attention order can itself be learned with an RNN
• Even just attending to positions in order from the front improves performance

Slide 35

Experimental results: WMT'14

Slide 36

Experimental results: WMT'15

Slide 37

Examples of actual translations

Slide 38

Summary so far
• The basics of neural networks
• Recurrent neural networks
• Attention

Slide 39

What I didn't talk about today
• Training (backpropagation, minibatches)
• Loss functions (log loss, cross-entropy loss)
• Regularization techniques
  • dropout, batch normalization
• Optimization techniques
  • RMSProp, AdaGrad, Adam
• The many activation functions
  • (Very) Leaky ReLU, Maxout

Slide 40

What I recommend next
• Try running some experiments of your own
  • Get something working first, even a simple example
  • Once it works, try modifying it yourself
• The important thing is to get your hands moving
• Don't reach for something too hard right at the start

Slide 41

Antennas for the latest news (1)
• Follow people on Twitter who post about machine learning and related topics
  • @hillbig is a good start
  • There are plenty of other good people too
• Watch out, though: there are some dubious ones as well

Slide 42

Antennas for the latest news (2)
• Read papers
  • Careful: some papers are a waste of your reading time
• At first, it's best to stick to papers accepted at well-known venues (ACL, EMNLP, ICML, NIPS, KDD, etc.)

Slide 43

Antennas for the latest news (3)
• Pay attention to who wrote the papers
  • As you read, a few authors whose papers you find interesting will start to stand out
• Find a way to keep up with those people's new papers

Slide 44

Take-home messages
• While your enthusiasm is running high, get hands-on and experiment
  • For beginners, Chainer is recommended
• The latest information is available online
  • Beware of people whose enthusiasm points in strange directions