Natural Language Processing Using LSTMs (LSTMを用いた自然言語処理について)
tkng
January 27, 2016
Slides on LSTMs presented at the 3rd TokyoCL study meeting.
Transcript
Natural Language Processing Using LSTMs
Tokunaga (@tkng) — 3rd TokyoCL study meeting @ Google Japan

About me
• Twitter: @tkng
• I like curry

Today's topics
• Recurrent Neural Networks & LSTM
• Recent research using LSTMs
• My own experiments with LSTMs

Recurrent Neural Network
• A general term for networks that consume the elements of a sequence one at a time while updating an internal state
• An LSTM can be viewed as one kind of RNN
(figure from colah.github.io, "Understanding LSTMs")
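The state update just described (read one element, update the state) can be sketched for a single LSTM step. This is a generic NumPy illustration, not code from the talk; the stacked [input, forget, output, candidate] gate layout is my own convention:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, b):
    """One LSTM time step.
    x: (I,) input, h: (H,) hidden state, c: (H,) cell state.
    W: (4H, I+H) stacked gate weights, b: (4H,) stacked gate biases.
    Gate layout assumed here: [input, forget, output, candidate]."""
    H = h.shape[0]
    z = W @ np.concatenate([x, h]) + b
    i = sigmoid(z[0:H])           # input gate
    f = sigmoid(z[H:2 * H])       # forget gate
    o = sigmoid(z[2 * H:3 * H])   # output gate
    g = np.tanh(z[3 * H:4 * H])   # candidate cell update
    c_new = f * c + i * g         # forget part of the old cell, add new content
    h_new = o * np.tanh(c_new)    # expose a gated view of the cell state
    return h_new, c_new
```

Running this step once per sequence element, carrying `(h, c)` forward, is the "update the state" loop the slide refers to.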
What is the seq2seq model?
• Encode variable-length input data into a fixed-length vector using a recurrent neural network, then decode the translated data from that vector

Translation with the seq2seq model
(diagram: the encoder reads "This is a pen EOS"; the decoder emits "これ は ペン です EOS")
• The point: "This is a pen" has been encoded into a fixed-length vector!
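The encode-then-decode flow above can be sketched with plain tanh-RNN cells standing in for LSTMs. All names and dimensions here are illustrative, not the talk's code, and the decoder is simplified (a real decoder also feeds its previous output token back in):

```python
import numpy as np

def encode(tokens, emb, W_in, W_h):
    """Fold a variable-length token-id sequence into one fixed-size vector."""
    h = np.zeros(W_h.shape[0])
    for t in tokens:                        # one step per input element
        h = np.tanh(emb[t] @ W_in.T + h @ W_h.T)
    return h                                # fixed-length summary of the input

def decode(h, W_h, W_out, eos_id, max_len=10):
    """Unroll a variable-length token-id sequence from the fixed-size state."""
    out = []
    for _ in range(max_len):
        h = np.tanh(h @ W_h.T)              # advance the decoder state
        tok = int(np.argmax(h @ W_out.T))   # greedy token choice
        out.append(tok)
        if tok == eos_id:                   # EOS lets output length vary
            break
    return out
```

The EOS token is what makes variable-length output possible: decoding continues until the model itself emits EOS.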
Effective Approaches to Attention-based Neural Machine Translation (Luong+, 2015)
• Machine translation with seq2seq
• Proposes a new technique called local attention
• Achieves state of the art on several language pairs
A Neural Conversational Model (Vinyals+, 2015)
• Built a dialogue system with LSTMs, and it produced plausible-looking conversations
Ask Me Anything: Dynamic Memory Networks for Natural Language Processing (Kumar+, 2015)
• Proposes Dynamic Memory Networks
• The model is essentially a combination of recurrent neural networks
• State of the art on question answering, POS tagging, coreference resolution, and sentiment analysis
• The claim: it is impressive that nearly the same model can solve all of these tasks
Show, Attend and Tell: Neural Image Caption Generation with Visual Attention (Xu+, 2015)
• Generates descriptive captions for images
• CNN + LSTM + attention
(http://kelvinxu.github.io/projects/capgen.html)
Semi-supervised Sequence Learning (Dai+, 2015)
• Experiments with LSTMs on classification tasks such as sentiment analysis
• Uses two pretraining methods, a language model and a sequence autoencoder, and sets a new state of the art on those experiments
• A simple method, and quietly impressive
An Empirical Exploration of Recurrent Network Architectures (Jozefowicz+, 2015)
• Used Google's compute power to evaluate a large number of LSTM and GRU variants
• Strongly recommends setting the forget-gate bias to 1 when initializing an LSTM
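The initialization recommendation above can be sketched as follows; this is my own illustration (with an assumed [input, forget, output, candidate] gate layout), not code from the paper or the talk:

```python
import numpy as np

def init_lstm_params(input_dim, hidden_dim, forget_bias=1.0, seed=0):
    """Initialize stacked LSTM gate parameters.
    Gate order assumed: [input, forget, output, candidate]."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.1, size=(4 * hidden_dim, input_dim + hidden_dim))
    b = np.zeros(4 * hidden_dim)
    # Jozefowicz et al. (2015): start with the forget gate biased open,
    # so the cell state is carried across time steps early in training.
    b[hidden_dim:2 * hidden_dim] = forget_bias
    return W, b
```

With the forget gate near sigmoid(1) ≈ 0.73 at the start, gradients can flow through the cell state before the network has learned when to forget.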
In short: LSTMs are hot right now!
• They reach state of the art on classification tasks
• seq2seq can generate sentences
• ...though why this works so well is not really understood
From here on: my own experiments
What did I test?
• A sentence autoencoder built with LSTMs
• The experiments are fairly rough in various respects
(diagram: the input sentence "これ は ペン です EOS" is encoded and then reconstructed)
Why did I test this?
• seq2seq is an interesting technique, but what happens inside it is not well understood
What I want to know
• What kinds of sentences does it tend to get wrong?
• Why can it close a bracket that was opened?
• Why does reversing the input work better?
• Do similar sentences produce similar states?
• Why can variable-length output be produced from fixed-length data?
Data
• Japanese blog data I had collected personally
• Training: 600k sentences (33 MB)
• Test: 35k sentences (2 MB)
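A quick back-of-envelope check of the corpus statistics above (my own arithmetic, assuming 1 MB = 10**6 bytes and roughly 3 bytes per UTF-8 Japanese character):

```python
# Average sentence size implied by the stated corpus statistics.
train_bytes = 33 * 10**6        # 33 MB of training data
train_sentences = 600_000       # 600k training sentences
avg_bytes = train_bytes / train_sentences
avg_chars = avg_bytes / 3       # ~3 bytes per Japanese character in UTF-8
print(avg_bytes, round(avg_chars))  # 55.0 bytes, about 18 characters per sentence
```

So the training sentences are fairly short on average, which matters later when accuracy is broken down by sentence length.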
Experimental setup (1)
• Vocabulary (or character set): 80,000 or 10,000
• Input converted to 100-dimensional embeddings
• 1 LSTM layer, 200-dimensional state
• Adam for optimization
• Code: used https://github.com/odashi/chainer_examples
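The setup above (100-dimensional embeddings, one LSTM layer with a 200-dimensional state) implies a parameter count we can sanity-check. This assumes the standard 4-gate LSTM parameterization and ignores the embedding and output layers:

```python
# Parameter count of one standard LSTM layer with the talk's dimensions.
input_dim, hidden_dim = 100, 200
# Each of the 4 gates has a weight matrix over [input; hidden] plus a bias.
params_per_gate = hidden_dim * (input_dim + hidden_dim) + hidden_dim
lstm_params = 4 * params_per_gate
print(lstm_params)  # 240800
```

At ~241k weights the LSTM itself is small; with an 80,000-word vocabulary, the embedding and softmax layers would dominate the model size.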
Experimental setup (2)
• Word-level vs. character-level input
• Reversed vs. forward input order
• 4 configurations in total
• Word-level: 50 epochs, vocabulary of 80,000 words
• Character-level: 100 epochs, vocabulary of 10,000 characters
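The 2×2 experimental grid above can be written out explicitly; the dictionary keys and names here are mine, not the talk's:

```python
from itertools import product

# Unit-dependent settings taken from the slide.
units = {"word": {"vocab": 80000, "epochs": 50},
         "char": {"vocab": 10000, "epochs": 100}}
orders = ["forward", "reversed"]

# All 4 configurations: {word, char} x {forward, reversed}.
configs = [{"unit": u, "order": o, **units[u]} for u, o in product(units, orders)]
for c in configs:
    print(c)
```

Each configuration trains its own autoencoder, so the comparisons later (forward vs. reversed, word vs. character) are between separately trained models.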
What does learning look like?
• trg = しゃく しゃく と ナシ の よう な 食感 に 上品 な 味 。 ("a crisp, pear-like texture and a refined taste")
• Successive hypotheses during training start as word salad that mixes in unrelated words, then gradually lock in more of the target until the model reproduces the sentence exactly
[figures: how learning progresses (accuracy over training)]
[figure: accuracy as a function of sentence length]
[figure: forward vs. reversed input order]
[figure: what if the output is reversed instead?]
Observations so far
• Long sentences are, unsurprisingly, easier to get wrong
• Reversing the input gives better accuracy
• Reversing the output instead works just as well
• Lowering the learning rate partway through training is effective
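The talk leaves the reversed-input result unexplained. One common intuition, from Sutskever et al.'s seq2seq paper rather than this talk, is that reversal creates short-range dependencies at the start of the sequence, which makes learning easier to get off the ground. A toy calculation for an autoencoder of length n, in my own notation:

```python
# RNN steps between reading token k (1-indexed) and emitting it again,
# for a sequence of length n, with and without input reversal.
def lag_forward(k, n):
    # Read at step k, emitted at step n + k.
    return (n - k) + k           # = n for every position

def lag_reversed(k, n):
    # Read at step n - k + 1 (input reversed), emitted at step n + k.
    return 2 * k - 1             # short for early tokens, long for late ones

n = 10
print([lag_forward(k, n) for k in range(1, n + 1)])   # every lag is 10
print([lag_reversed(k, n) for k in range(1, n + 1)])  # 1, 3, 5, ..., 19
```

The average lag is the same either way; what changes is that with reversal the first output token depends on a very recent input, giving the optimizer an easy signal early in decoding.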
What if the input is perturbed slightly?
Reference sentence: 「会社でインフルエンザが大流行している。」 ("Influenza is going around at the office.")
• Input: 会社でンフルエンザが大流行している。 (one character deleted) → output reproduces most of the sentence but garbles a few characters near the end
• Input: 会社でイフルエンザが大流行している。 (a different character deleted) → similar small corruption in the output
• Input: インフルエンザ → output: インフルエント
Another example: a mysterious collapse
• Input: 会社でインフルンエザが大流行している。 (two characters transposed)
• Output: the prefix 会社でインフル... is reproduced, then the sentence collapses into gibberish partway through
A mistake that makes sense
• Input: 私は『週刊少年サンデー』を毎週購読しています。 ("I subscribe to Weekly Shonen Sunday every week.")
• Output: 私は『週刊少年サンデーをを毎買読しています。 (drops the closing bracket, doubles を, and substitutes the similar-use character 買 for 購)
Observations from perturbing the input
• Long sentences sometimes fail
• Sometimes only a single character is corrupted; other times the output collapses completely partway through
• There are examples of confusing visually or functionally similar characters, such as substituting one character for another with a similar usage pattern
Future work
• More careful experiments
• A broader hyperparameter search
• 2- and 3-layer LSTMs
• Dropout
• Visualizing the state
Summary
• Had an LSTM reconstruct sentences
• In practice it reconstructs them reasonably well
• Found that reversing the output works as an alternative to reversing the input
Recent related work
• http://arxiv.org/abs/1506.02078 — visualizing LSTMs
• http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.1.3412&rep=rep1&type=pdf — RNNs can parse context-free grammars
• http://www.aclweb.org/anthology/P/P15/P15-1107.pdf — sentence autoencoder experiments