Natural Language Processing with LSTMs (LSTMを用いた自然言語処理について)
tkng
January 27, 2016
Technology
Presentation slides about LSTMs from the 3rd TokyoCL study meeting.
Transcript
Natural Language Processing with LSTMs / Tokunaga (@tkng) / 3rd TokyoCL study meeting, held at Google
Self-introduction • Twitter: @tkng • I like curry
Today's topics • Recurrent Neural Network & LSTM • Recent research using LSTMs • My own experiments with LSTMs
Recurrent Neural Network • A general term for networks that take in the elements of a sequence one at a time and update an internal state • LSTM can be regarded as one kind of RNN • http://colah.github.io/posts/2015-08-Understanding-LSTMs/
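To make the "read one element at a time and update a state" idea concrete, here is a minimal sketch of a single vanilla-RNN step in NumPy; the weight names are made up for illustration and this is not code from the talk.

```python
# Minimal sketch (not from the talk): one vanilla-RNN step in NumPy.
# An LSTM follows the same pattern but adds input/forget/output gates and a
# separate cell state, which is what lets it keep information over long spans.
import numpy as np

def rnn_step(x, h, W_xh, W_hh, b_h):
    # The new state depends on the current input and the previous state.
    return np.tanh(x @ W_xh + h @ W_hh + b_h)

rng = np.random.default_rng(0)
x = rng.standard_normal(100)                   # one input element (e.g. a word embedding)
h = np.zeros(200)                              # previous hidden state
W_xh = 0.01 * rng.standard_normal((100, 200))
W_hh = 0.01 * rng.standard_normal((200, 200))
h = rnn_step(x, h, W_xh, W_hh, np.zeros(200))  # updated state after reading x
```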
What is the seq2seq model? • Variable-length input data is encoded into a fixed-length vector with a Recurrent Neural Network, and the translated output is then decoded from that vector (see the code sketch after the next figure)
Translation with the seq2seq model [figure: the encoder reads "This is a pen" EOS and the decoder emits これ は ペン です EOS; note that the entire input "This is a pen" is encoded into a single fixed-length vector]
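As a rough, runnable illustration of that encode-then-decode mechanism (not the talk's implementation; the toy model below is untrained and written in PyTorch, so its output is meaningless), the loop structure looks like this:

```python
# Minimal seq2seq sketch: fold the whole input into one fixed-size state,
# then emit output tokens one at a time from that state until EOS.
import torch
import torch.nn as nn

VOCAB, EMB, HID, EOS = 1000, 100, 200, 0   # toy sizes; the EOS id is arbitrary here
embed = nn.Embedding(VOCAB, EMB)
encoder = nn.LSTMCell(EMB, HID)
decoder = nn.LSTMCell(EMB, HID)
readout = nn.Linear(HID, VOCAB)

def translate(src_ids, max_len=20):
    h = c = torch.zeros(1, HID)
    for tok in src_ids:                        # encode: one element at a time
        h, c = encoder(embed(torch.tensor([tok])), (h, c))
    out, prev = [], EOS
    for _ in range(max_len):                   # decode from the fixed-size state
        h, c = decoder(embed(torch.tensor([prev])), (h, c))
        prev = readout(h).argmax(dim=-1).item()
        if prev == EOS:
            break
        out.append(prev)
    return out

print(translate([5, 42, 7, EOS]))              # untrained, so the output is meaningless
```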
Effective Approaches to Attention-based Neural Machine Translation (Luong+, 2015) • Machine translation with seq2seq • Proposes a new local attention mechanism • Reaches state of the art on several language pairs
A Neural Conversational Model (Vinyals+, 2015) • Built a dialogue system with an LSTM and it behaved surprisingly plausibly
Ask Me Anything: Dynamic Memory Networks for Natural Language Processing (Kumar+, 2015) • Proposes Dynamic Memory Networks • The model looks like a combination of recurrent neural networks • State of the art on question answering, POS tagging, coreference resolution, and sentiment analysis • The claim is that it is remarkable that almost the same model can solve such different tasks
Show, Attend and Tell: Neural Image Caption Generation with Visual Attention (Xu+, 2015) • Generates captions for images • CNN + LSTM + attention • http://kelvinxu.github.io/projects/capgen.html
Semi-supervised Sequence Learning (Dai+, 2015) • Experiments with LSTMs on classification tasks such as sentiment analysis • Uses a language model and a sequence autoencoder as two pretraining methods, setting a new state of the art on those tasks • A simple method that is quietly impressive
An Empirical Exploration of Recurrent Network Architectures (Jozefowicz+, 2015) • Uses Google's computing power to evaluate a large number of LSTM and GRU variants • Strongly recommends setting the forget gate bias to 1 when initializing an LSTM (a sketch follows below)
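A small sketch of that forget-gate-bias recommendation, written in PyTorch purely for illustration (the experiments later in this deck used Chainer):

```python
# Sketch of the Jozefowicz+ (2015) recommendation: initialize the LSTM
# forget-gate bias to 1 (PyTorch shown for illustration only).
import torch.nn as nn

lstm = nn.LSTM(input_size=100, hidden_size=200)
H = lstm.hidden_size
# PyTorch stores each bias vector with the four gates in the order
# [input, forget, cell, output], so the forget slice is [H:2*H].
lstm.bias_ih_l0.data[H:2 * H].fill_(1.0)
```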
In short, LSTMs are hot right now! • They reach state of the art on classification tasks • Seq2seq can generate sentences • Why this works so well is still not well understood...
From here on: my own experiments
What did I experiment with? • A sentence autoencoder built with an LSTM • The experiments are rather rough in places [figure: これ は ペン です EOS is encoded and then decoded back as これ は ペン です EOS]
Why did I experiment? • Seq2seq is an interesting technique, but what goes on inside it is not well understood
What I wanted to know • What kinds of sentences does it tend to get wrong? • Why can it close a bracket that was opened earlier? • Why does reversing the input order help? • Do similar sentences produce similar states? • Why can variable-length output be produced from a fixed-length vector?
Data • Japanese blog data I had been collecting personally • Training: 600,000 sentences (33MB) • Test: 35,000 sentences (2MB)
Experiment settings (1) • Vocabulary (or character) size: 80,000 or 10,000 • Inputs are converted to 100-dimensional embeddings • One LSTM layer with a 200-dimensional state • Adam is used for optimization • The code is based on https://github.com/odashi/chainer_examples (see the sketch below)
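A minimal sketch of a model with these settings (100-dimensional embeddings, a single LSTM layer with a 200-dimensional state, Adam); the class below is hypothetical and written in PyTorch, not the Chainer code from odashi/chainer_examples that was actually used:

```python
# Hypothetical sketch of the sentence-autoencoder settings above (PyTorch,
# not the actual chainer_examples code used in the talk).
import torch
import torch.nn as nn

class SentenceAutoencoder(nn.Module):
    def __init__(self, vocab_size=80000, embed_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.readout = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Encode the whole sentence into the final (h, c) state ...
        _, state = self.encoder(self.embed(src_ids))
        # ... then decode the same sentence back from that fixed-size state.
        dec_out, _ = self.decoder(self.embed(tgt_ids), state)
        return self.readout(dec_out)           # per-position vocabulary logits

model = SentenceAutoencoder()
optimizer = torch.optim.Adam(model.parameters())   # Adam, as in the slide
```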
Experiment settings (2) • Whether the input is split into words or characters • Whether the input is fed in reverse order or as-is • 4 combinations in total were tested • Word units: 50 epochs, vocabulary of 80,000 words • Character units: 100 epochs, vocabulary of 10,000 characters (see the preprocessing sketch below)
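The four configurations amount to a small preprocessing choice; a rough sketch is below, where whitespace splitting stands in for real Japanese word segmentation (the slides do not say which segmenter was used):

```python
# Rough sketch of the four input configurations: word vs. character units,
# normal vs. reversed order. Whitespace splitting is only a stand-in for
# proper Japanese word segmentation.
def make_tokens(sentence, unit="word", reverse=False):
    tokens = list(sentence.replace(" ", "")) if unit == "char" else sentence.split()
    return tokens[::-1] if reverse else tokens

for unit in ("word", "char"):
    for reverse in (False, True):
        print(unit, reverse, make_tokens("これ は ペン です", unit, reverse))
```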
What does learning look like? • trg = しゃく しゃく と ナシ の よう な 食感 に 上品 な 味 。 • [the slide lists the model's hypotheses for this target at successive training stages: early hypotheses substitute unrelated words, later ones differ in only a token or two, and the final hypothesis reproduces the target exactly]
[Two figures: how learning progresses (sentence-level reconstruction accuracy over training)]
[Figure: how accuracy changes with sentence length]
[Figure: comparison of normal vs. reversed input order]
[Figure: what happens if the output is reversed instead?]
Observations so far • Longer sentences are clearly easier to get wrong • Feeding the input in reverse order gives better performance • Reversing the output instead also works • Lowering the learning rate partway through training is effective
What if the input is perturbed slightly? • Base sentence: 「会社でインフルエンザが大流行している。」 ("Influenza is going around at the office.") • Dropping a single character from インフルエンザ in the input (ンフルエンザ, イフルエンザ) yields outputs that are almost right but corrupted near the end of the sentence • The input インフルエンザ alone comes back as インフルエント
A mysterious collapse on another example • Input: 会社でインフルンエザが大流行している。 (with ン and エ swapped) • Output: the beginning (会社でインフル...) is reproduced, then the rest collapses into gibberish
An understandable kind of mistake • Input: 私は『週刊少年サンデー』を毎週購読しています。 ("I subscribe to Weekly Shonen Sunday every week.") • Output: 私は『週刊少年サンデーをを毎買読しています。 (the closing bracket is lost, を is doubled, and 購読 degrades into 買読)
Observations from perturbing the input • It sometimes fails on long sentences • Sometimes only a single character is corrupted, but sometimes the output starts to collapse completely partway through • Mistakes that mix up characters with similar usage (as in the 購読 / 買読 example above) are also seen
Future work • More rigorous experiments • A broader hyperparameter search • 2- and 3-layer LSTMs • Dropout • Visualizing the internal state
Summary • I had an LSTM reconstruct sentences • It can in fact reconstruct them reasonably well • Instead of reversing the input, reversing the output also turns out to work
Recent related work • http://arxiv.org/abs/1506.02078 (visualizing LSTMs) • http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.1.3412&rep=rep1&type=pdf (RNNs can parse context-free grammars) • http://www.aclweb.org/anthology/P/P15/P15-1107.pdf (experiments with sentence autoencoders)