Deep Learning from Scratch 2 (Natural Language Edition) Reading Group, Session 10


naomine-egawa

August 21, 2019

Transcript

  1. Deep Learning from Scratch 2 (Natural Language Edition) Reading Group, Session 10

  2. Self-introduction
     • Naomine Egawa / suddenly in my 40s / father of one
     • Entered the IT world through a part-time job while at university. After subcontract development, freelancing, and so on, I am now at Amazon Web Services Japan working in support as a Technical Account Manager.
     • Recently working with AWS, Docker, Java (Struts2...)
     • No experience with AI or data science. Reading volume 1 of this series opened my eyes to how interesting Deep Learning is, and I have since taken the G certification, among other things.
     • Recently hooked on "The Big Bang Theory" on Amazon Prime Video
  3. About this slide deck
     • A summary of what I understood from reading the second half of Chapter 4 (4.3–4.4).
     • Chapter 4: Speeding up word2vec
       • 4.3 Training the improved word2vec
       • 4.4 Remaining topics on word2vec
     • In page terms, pp. 160–174.
  4. Implementing the improved version
     • The first half of Chapter 4 (4.1–4.2) introduced two ideas for improving the word2vec of Chapter 3: the Embedding layer and Negative Sampling.
     • Section 4.3 implements them.
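As a rough illustration of the Negative Sampling idea recapped here, negative (wrong-answer) words are drawn from the unigram distribution raised to the 0.75 power. This is a minimal sketch using a toy corpus, not the book's UnigramSampler class:

```python
import numpy as np

# Toy stand-in corpus: a flat array of word IDs (not the book's data).
corpus = np.array([0, 1, 2, 3, 4, 1, 2, 3, 2, 1])

# Negative-sampling distribution: unigram counts raised to the 0.75 power,
# which boosts the probability of rare words relative to raw frequency.
counts = np.bincount(corpus).astype(np.float64)
p = counts ** 0.75
p /= p.sum()

# Draw 5 "wrong answer" word IDs to use as negative examples.
rng = np.random.default_rng(0)
negatives = rng.choice(len(p), size=5, p=p)
print(negatives)
```

Raising counts to the 0.75 power is the trick from the original word2vec paper: very frequent words get sampled a little less often, and rare words a little more.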
  5. ch03/simple_cbow.py -> ch04/cbow.py
     • Initialization
       • The input layer becomes an Embedding layer
       • The output layer becomes a NegativeSamplingLoss layer
       • The window size changes from a fixed 2 to a variable length
  6. ch03/simple_cbow.py -> ch04/cbow.py
     • forward and backward
       • Because the window size is now variable, in_layer is handled in a loop
       • The output layer's Negative Sampling and Sigmoid-with-loss implementations are merged into a single layer
       • Note that the Embedding layer and the Negative Sampling with Loss layer take Contexts (the surrounding words) and Target (the correct center word) as word IDs, not one-hot vectors
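To make the ID-based flow concrete, here is a deliberately simplified, hypothetical sketch of a CBOW forward pass (not the book's ch04/cbow.py): since contexts are word IDs, the "input layer" is just row lookups into the embedding matrix, averaged over the window, then scored against the target word's output vector through a sigmoid. Negative sampling itself is omitted for brevity; only the positive branch is shown.

```python
import numpy as np

class TinyCBOW:
    """Hypothetical simplified CBOW: word-ID inputs, positive score only."""
    def __init__(self, vocab_size, hidden_size):
        rng = np.random.default_rng(0)
        self.W_in = 0.01 * rng.standard_normal((vocab_size, hidden_size))
        self.W_out = 0.01 * rng.standard_normal((vocab_size, hidden_size))

    def forward(self, contexts, target):
        # contexts: (batch, 2*window) word IDs; target: (batch,) word IDs.
        # The Embedding "layer" is just a row lookup; average over the window.
        h = self.W_in[contexts].mean(axis=1)
        # Score of the correct (positive) word, squashed through a sigmoid,
        # as in the positive branch of negative sampling.
        score = np.sum(h * self.W_out[target], axis=1)
        return 1.0 / (1.0 + np.exp(-score))

model = TinyCBOW(vocab_size=7, hidden_size=5)
prob = model.forward(np.array([[0, 2]]), np.array([1]))
print(prob.shape)  # (1,)
```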
  7. ch03/train.py -> ch04/train.py
     • Changes
       • Contexts and Target changed from one-hot vectors to word IDs.
       • Instead of the tiny seven-word corpus used so far, the PTB corpus with its much larger vocabulary is used. → Only the corpus that gets loaded differs.
       • A GPU can now be used via an option. → Already explained on p. 52.
       • The corpus and parameters are saved to a file so they can be reused later. → Uses pickle, a library for saving objects to files.
  8. ch03/train.py -> ch04/train.py
     • Contexts and Target changed from one-hot vectors to word IDs.
       → The conversion step using convert_one_hot() is gone.
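The pickle step mentioned above can be sketched roughly as follows. The file name and key names mirror those used in the book's training script, but the vectors below are dummy stand-ins rather than actual training results:

```python
import pickle
import numpy as np

# Dummy stand-ins for the trained results (the real ones come from training).
word_vecs = np.arange(15, dtype=np.float32).reshape(5, 3)
word_to_id = {'you': 0, 'say': 1, 'goodbye': 2, 'and': 3, 'i': 4}
id_to_word = {i: w for w, i in word_to_id.items()}

# Bundle everything into one dict and write it with the highest protocol.
params = {'word_vecs': word_vecs, 'word_to_id': word_to_id, 'id_to_word': id_to_word}
with open('cbow_params.pkl', 'wb') as f:
    pickle.dump(params, f, -1)

# Later (e.g. in an evaluation script), load everything back in one call.
with open('cbow_params.pkl', 'rb') as f:
    restored = pickle.load(f)
print(restored['id_to_word'][0])  # you
```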
  9. Evaluating the CBOW model
     • Display words with similar meanings
     • Reuses the most_similar() code from Chapter 2
     • "you" → "I", "we"
     • "year" → "month", "week"
     • "toyota" → "ford", "mazda"
     • Results similar to the count-based method on PTB (p. 90)!
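The idea behind most_similar() can be sketched with cosine similarity over toy vectors. This is an illustrative re-implementation, not the actual code from common/util.py, and the three words and their 2-D vectors are made up:

```python
import numpy as np

def most_similar(query, word_to_id, id_to_word, word_vecs, top=2):
    # Cosine similarity of every word vector against the query vector.
    q = word_vecs[word_to_id[query]]
    sims = word_vecs @ q / (np.linalg.norm(word_vecs, axis=1) * np.linalg.norm(q) + 1e-8)
    sims[word_to_id[query]] = -np.inf  # exclude the query word itself
    return [id_to_word[i] for i in np.argsort(-sims)[:top]]

# Toy vocabulary: 'you' and 'i' point in similar directions, 'car' does not.
word_to_id = {'you': 0, 'i': 1, 'car': 2}
id_to_word = {v: k for k, v in word_to_id.items()}
vecs = np.array([[1.0, 0.1], [0.9, 0.2], [0.0, 1.0]])
print(most_similar('you', word_to_id, id_to_word, vecs))  # ['i', 'car']
```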
  10. Solving analogy problems
     • An analogy problem is, for example, "king − man + woman = queen"
     • Since word2vec words are vectors, analogy problems can be implemented with vector addition and subtraction. (The figure on p. 166 gives the intuition.)
     • The implementation is analogy() in common/util.py
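The vector arithmetic can be illustrated with a hand-rolled sketch (this is not common/util.py's analogy() itself): to solve a:b = c:?, compute vec(b) − vec(a) + vec(c) and take the nearest word by cosine similarity. The toy 2-D vectors below are chosen by hand so the analogy works out:

```python
import numpy as np

def analogy(a, b, c, word_to_id, id_to_word, word_vecs):
    # Solve a : b = c : ?  ->  vec(b) - vec(a) + vec(c)
    query = word_vecs[word_to_id[b]] - word_vecs[word_to_id[a]] + word_vecs[word_to_id[c]]
    sims = word_vecs @ query / (np.linalg.norm(word_vecs, axis=1) * np.linalg.norm(query) + 1e-8)
    for w in (a, b, c):
        sims[word_to_id[w]] = -np.inf  # the given words cannot be the answer
    return id_to_word[int(np.argmax(sims))]

# Hand-picked vectors: man - king + queen lands exactly on woman.
word_to_id = {'king': 0, 'man': 1, 'queen': 2, 'woman': 3}
id_to_word = {v: k for k, v in word_to_id.items()}
vecs = np.array([[2.0, 2.0], [2.0, 1.0], [1.0, 2.0], [1.0, 1.0]])
print(analogy('king', 'man', 'queen', word_to_id, id_to_word, vecs))  # woman
```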
  11. Actual results
     • king:man = queen:? -> woman    correct
     • take:took = go:? -> went    correct
     • car:cars = child:? -> children    correct
     • good:better = bad:? -> more    incorrect
     • The three correct examples above were deliberately chosen by the book's author; in practice incorrect answers are more common. The reason is that the PTB corpus is small.
     • dataset/ptb.train.txt contains 887,521 words (vocabulary of 10,000)
  12. Remaining topics
     • Applications of word2vec
       • Use distributed representations trained in advance on a large corpus for tasks such as text classification, document clustering, part-of-speech tagging, and sentiment analysis.
     • Once words can be converted to fixed-length vectors, sentences can be converted to fixed-length vectors too.
       → bag-of-words, RNN
     • Once sentences can be converted to fixed-length vectors, they can be fed as input to general machine learning systems.
       → e.g., convert emails to vectors with a trained word2vec, then classify them with an SVM or a neural network
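The "fixed-length sentence vector" idea can be sketched in its simplest form by averaging word vectors. This is only a rough illustration (the bag-of-words and RNN approaches named on the slide are more elaborate), and the vocabulary and vectors below are made up:

```python
import numpy as np

# Stand-in for pre-trained word vectors: 5 words, 8 dimensions each.
word_to_id = {'this': 0, 'movie': 1, 'is': 2, 'great': 3, 'bad': 4}
word_vecs = np.random.default_rng(0).standard_normal((5, 8))

def sentence_vector(words):
    # Average the vectors of known words: the result has a fixed length
    # (8 here) no matter how long the sentence is.
    ids = [word_to_id[w] for w in words if w in word_to_id]
    return word_vecs[ids].mean(axis=0)

v1 = sentence_vector(['this', 'movie', 'is', 'great'])
v2 = sentence_vector(['this', 'movie', 'is', 'bad'])
print(v1.shape, v2.shape)  # (8,) (8,)
```

Vectors like v1 and v2 can then be handed to any ordinary classifier, which is exactly the pipeline the slide describes for email classification.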
  13. How to evaluate word vectors
     • When training for a specific task (say, sentiment analysis), retraining word2vec every time is inefficient.
     • What is needed is a "high-performance word2vec" or a "general-purpose word2vec".
       → So we need a way to evaluate model performance
     • Semantics: inferring the meaning of words
       → "king : queen = actor : actress"
     • Syntax: testing the morphological information of words
       → "bad : worst = good : best"
  14. Performance comparison

      Model       Dims   Corpus size   Semantics   Syntax   Total
      CBOW         300   1.0 billion      61         61      61
      skip-gram    300   1.6 billion      16.1       52.6    36.1
      CBOW         300   6.0 billion      63.6       67.4    65.7
      skip-gram    300   6.0 billion      73.0       66.0    69.1
      CBOW        1000   6.0 billion      57.3       68.9    63.7
      skip-gram   1000   6.0 billion      66.1       65.1    65.6

      ※ The paper this table comes from is about the GloVe model, which takes the best of both the count-based approach and word2vec. (See the original paper.)
  15. Bonus: training on the PTB dataset with a GPU
     • CPU: Core i7 3770, GPU: RTX 2060
       → On the CPU it felt like it would never finish; on the GPU it took 20 minutes
     • Some additional code fixes were also needed
     • https://github.com/naomine-egawa/oreilly-japan-deep-learning-from-scratch-2
  16. Recommended resource
     • http://jalammar.github.io/illustrated-word2vec/
     • The illustrations make it easy to follow!