Natural Language Processing with LSTMs (LSTMを用いた自然言語処理について)
tkng
January 27, 2016
Slides on LSTMs presented at the 3rd TokyoCL study group.
Transcript
Natural Language Processing with LSTMs — @tkng, 3rd TokyoCL study group, held at Google (グーグル株式会社)
About me
• Twitter: @tkng
• I like curry
Today's topics
• Recurrent Neural Networks & LSTM
• Recent research using LSTMs
• My own experiments with LSTMs
Recurrent Neural Networks
• A general term for networks that consume the elements of a sequence one at a time, updating an internal state
• LSTM can be seen as one kind of RNN
(Figure from colah.github.io, "Understanding LSTMs")
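The "consume one element at a time, update a state" idea can be sketched in plain Python. This is a hypothetical minimal cell for illustration, not the deck's code; the scalar weights are arbitrary:

```python
import math

def rnn_step(state, x, w_state=0.5, w_input=1.0, bias=0.0):
    """One RNN step: new_state = tanh(w_s*state + w_x*x + b), elementwise toy version."""
    return [math.tanh(w_state * s + w_input * xi + bias) for s, xi in zip(state, x)]

def rnn_encode(inputs, hidden_size=4):
    """Consume a sequence one element at a time, carrying the state forward."""
    state = [0.0] * hidden_size
    for x in inputs:
        state = rnn_step(state, x)
    return state

# The final state is a fixed-size summary of the whole sequence.
final = rnn_encode([[1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0]])
print(final)
```

An LSTM replaces the single `tanh` update with gated updates to a cell state, but the sequential state-carrying loop is the same.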
What is the seq2seq model?
• Encode a variable-length input into a fixed-length vector with a recurrent neural network, then decode the translated output from that vector
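The defining property above — any input length is squeezed into a fixed-size vector before decoding — can be sketched with toy code (hypothetical stand-in functions, not the actual model; the hash-based "encoder" just mimics the shape of the computation):

```python
import math

HIDDEN = 3  # fixed size of the encoder state

def encode(tokens):
    """Fold a variable-length token sequence into a fixed-length vector."""
    state = [0.0] * HIDDEN
    for t in tokens:
        h = (hash(t) % 100) / 100.0  # toy stand-in for an embedding lookup
        state = [math.tanh(s + h * (i + 1)) for i, s in enumerate(state)]
    return state

def decode(state, max_len=10):
    """Greedily emit tokens from the fixed vector until EOS (toy stopping rule
    standing in for a learned softmax over the vocabulary)."""
    out = []
    for step in range(max_len):
        if state[step % HIDDEN] < 0.05:
            break
        out.append("tok%d" % step)
        state = [s * 0.5 for s in state]  # decay the state as tokens are emitted
    out.append("EOS")
    return out

v1 = encode(["This", "is", "a", "pen"])
v2 = encode(["This", "is", "a", "very", "long", "sentence", "indeed"])
assert len(v1) == len(v2) == HIDDEN  # any input length collapses to the same size
```

The bottleneck is the whole point of the later experiments: everything the decoder produces must come out of that one fixed-length vector.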
Translation with the seq2seq model
This is a pen EOS → これ は ペン です EOS
(the encoder reads the English tokens; the decoder then emits これ / は / ペン / です / EOS one token at a time)

Translation with the seq2seq model
• "This is a pen" is encoded into a fixed-length vector!
Effective Approaches to Attention-based Neural Machine Translation (Luong+, 2015)
• Machine translation with seq2seq
• Proposes a new technique called local attention
• Achieves state of the art on several language pairs
A Neural Conversational Model (Vinyals+, 2015)
• Built a dialogue system with LSTMs, and it behaved rather plausibly
Ask Me Anything: Dynamic Memory Networks for Natural Language Processing (Kumar+, 2015)
• Proposes Dynamic Memory Networks
• The model looks like a combination of recurrent neural networks
• State of the art on question answering, part-of-speech tagging, coreference resolution, and sentiment analysis
• The claim: solving many tasks with essentially the same model is what makes it remarkable
Show, Attend and Tell: Neural Image Caption Generation with Visual Attention (Xu+, 2015)
• Generates captions for images
• CNN + LSTM + attention
(see kelvinxu.github.io/projects/capgen.html)
Semi-supervised Sequence Learning (Dai+, 2015)
• Experiments with LSTMs on classification tasks such as sentiment analysis
• Uses a language model and a sequence autoencoder as two pretraining methods, setting new state of the art on those tasks
• A simple method, and quietly impressive
An Empirical Exploration of Recurrent Network Architectures (Jozefowicz+, 2015)
• Leveraging Google's compute power, evaluated a large number of LSTM and GRU variants
• Strongly recommends setting the forget-gate bias to 1 when initializing an LSTM
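The forget-gate recommendation is easy to see numerically. With a bias of 1 the gate starts near sigmoid(1) ≈ 0.73, so the cell state is mostly retained early in training; with the common default of 0 it starts at exactly 0.5. A toy gate computation (illustrative weights only):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forget_gate(h_prev, x, w_h=0.0, w_x=0.0, bias=1.0):
    """LSTM forget gate: f = sigmoid(w_h*h_prev + w_x*x + bias).
    Near initialization the weights are close to zero, so f ≈ sigmoid(bias)."""
    return sigmoid(w_h * h_prev + w_x * x + bias)

# bias=1: the gate keeps ~73% of the cell state at step 0, so gradients
# can flow across many timesteps from the start of training.
f_recommended = forget_gate(0.0, 0.0, bias=1.0)
# bias=0: only 50% is kept per step, so long-range memory decays quickly.
f_default = forget_gate(0.0, 0.0, bias=0.0)
print(f_recommended, f_default)
```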
In short: LSTMs are hot right now!
• They reach state of the art on classification tasks
• Seq2seq lets them generate sentences
• …though why this works so well is still unclear
From here on: my own experiments
What did I test?
• A sentence autoencoder built with LSTMs
• The experiments are fairly rough in places
これ ペン です EOS → これ ペン です EOS (the network reconstructs its own input)
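A sentence autoencoder is just seq2seq with the target set equal to the input. A hypothetical sketch of how training pairs might be built (the EOS marker name is illustrative, not taken from the deck's code):

```python
EOS = "<eos>"

def make_autoencoder_pair(tokens):
    """For a sentence autoencoder, source and target are the same sentence;
    the decoder side is terminated with an explicit EOS symbol."""
    source = list(tokens)
    target = list(tokens) + [EOS]
    return source, target

src, tgt = make_autoencoder_pair(["これ", "は", "ペン", "です"])
print(src)  # ['これ', 'は', 'ペン', 'です']
print(tgt)  # ['これ', 'は', 'ペン', 'です', '<eos>']
```

Because no parallel corpus is needed, any monolingual text (here, blog data) can serve as training data.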
Why did I test it?
• Seq2seq is an interesting technique, but what happens inside is not well understood
What I want to know
• What kinds of sentences does it tend to get wrong?
• Why can it close a bracket that it opened?
• Why does reversing the input help?
• Do similar sentences produce similar states?
• Why can variable-length output be produced from fixed-length data?
Data
• Japanese blog data I had collected personally
• Training: 600k sentences (33MB)
• Test: 35k sentences (2MB)
Experimental setup (1)
• Vocabulary of 80,000 words (or 10,000 characters)
• Inputs are converted to 100-dimensional embeddings
• One LSTM layer with a 200-dimensional state
• Adam is used for optimization
• The code at https://github.com/odashi/chainer_examples was used
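For a sense of scale, the parameter counts implied by this setup can be computed directly. This assumes the standard LSTM parameterization (4 gates, each with input weights, recurrent weights, and a bias) plus an output softmax; the deck itself does not list these numbers:

```python
vocab = 80000   # word-level vocabulary size
embed_dim = 100 # embedding dimension
hidden = 200    # LSTM state size

# Embedding table: one 100-d vector per vocabulary entry.
embedding_params = vocab * embed_dim  # 8,000,000 parameters

# Standard LSTM: 4 gates x (input weights + recurrent weights + bias).
lstm_params = 4 * (embed_dim * hidden + hidden * hidden + hidden)  # 240,800

# Output softmax projecting the 200-d state back onto the vocabulary.
softmax_params = hidden * vocab + vocab  # 16,080,000

print(embedding_params, lstm_params, softmax_params)
```

Notably, the LSTM itself is tiny compared to the embedding and softmax tables, which is typical for word-level models with large vocabularies.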
Experimental setup (2)
• Word-level vs. character-level input
• Reversed vs. original input order
• Four patterns in total were tested
• Word-level: 50 epochs, vocabulary of 80,000 words
• Character-level: 100 epochs, vocabulary of 10,000 characters
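The four patterns above amount to a small preprocessing choice. A hypothetical helper (assuming whitespace-tokenized word input; not the deck's code):

```python
def preprocess(sentence, unit="word", reverse=False):
    """Build the encoder input under one of the four settings:
    {word, char} unit x {forward, reversed} order."""
    if unit == "word":
        tokens = sentence.split()
    else:  # character-level: every character becomes a token
        tokens = list(sentence.replace(" ", ""))
    if reverse:
        tokens = tokens[::-1]
    return tokens

s = "これ は ペン です"
print(preprocess(s, unit="word"))                # ['これ', 'は', 'ペン', 'です']
print(preprocess(s, unit="word", reverse=True))  # ['です', 'ペン', 'は', 'これ']
print(preprocess(s, unit="char"))                # ['こ', 'れ', 'は', 'ペ', 'ン', 'で', 'す']
```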
How does learning progress?
• trg = しゃく しゃく と ナシ の よう な 食感 に 上品 な 味 。 ("a crisp, pear-like texture and a refined taste")
• hyp = カラー と 出 の よう な 事態 に に 非常 な 施設 。
• hyp = 体 と 国家 の よう な 食感 に 行く な 旦那 。
• hyp = と ナシ の よう な 食感 に 上品 な 旦那 。
• hyp = 入札 と ナシ の よう な 食 に 上品 な 味 。
• hyp = 児玉 園 と ナシ の よう な 食感 に 上品 な 味 。
• hyp = PA しゃく と 休憩 の よう な 食 すぎ に 上品 な 味 。
• hyp = しゃく しゃく と ナシ の よう な 食感 に 上品 な 味 。
• hyp = しゃく しゃく と ↑ の よう な 食感 に 上品 な 味 。
• hyp = しゃく しゃく と ↑ の よう な 食感 に 上品 な 味 。
• hyp = しゃく しゃく と ナシ の な な 食感 に 上品 な 味 。
• hyp = しゃく しゃく と ナシ の よう な 食感 に 上品 な 味 。
(successive hypotheses over training gradually converge to the target sentence)
[Chart] How learning progresses (sentence-level accuracy)
[Chart] How learning progresses (sentence-level accuracy)
[Chart] Accuracy as a function of sentence length
[Chart] Forward vs. reversed input order
[Chart] What if the output is reversed instead…?
Observations so far
• Long sentences are clearly easier to get wrong
• Reversed input order performs better
• Reversing the output instead also works
• Lowering the learning rate partway through training is effective
What if the input is perturbed slightly?
Original: 「会社でインフルエンザが大流行している。」 ("Influenza is going around at the office.")
Input: 会社でンフルエンザが大流行している。 → Output: 会社でンフルエンザが大流そにいる。
Input: 会社でイフルエンザが大流行している。 → Output: 会社でイフルエンザが大流しそいる。
Input: インフルエンザ → Output: インフルエント
A mysterious collapse in another example
Input: 会社でインフルンエザが大流行している。
Output: 会社でインフルンエレ(麺わ開流だ増嬉。 (the output collapses into garbage partway through)
A mistake that makes sense
Input: 私は『週刊少年サンデー』を毎週購読しています。
Output: 私は『週刊少年サンデーをを毎買読しています。
Observations from perturbing the input
• It sometimes fails on long sentences
• Sometimes only a single character is corrupted, but sometimes the output starts to collapse completely partway through
• We see examples where it confuses characters with similar usage (e.g. 購読 becoming 買読)
Future work
• More careful experiments
• A broader parameter search
• Using 2- or 3-layer LSTMs
• Using dropout
• Visualizing the state
Summary
• I had an LSTM reconstruct sentences
• In practice it reconstructs them reasonably well
• It turns out that instead of reversing the input, you can reverse the output
Recent related work
• http://arxiv.org/abs/1506.02078 — visualizing LSTMs
• http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.1.3412&rep=rep1&type=pdf — RNNs can parse context-free grammars
• http://www.aclweb.org/anthology/P/P15/P15-1107.pdf — sentence autoencoder experiments