自然言語処理と深層学習の最先端 (The Cutting Edge of Natural Language Processing and Deep Learning)
tkng
January 15, 2016
Technology
Presentation slides for the 4th JustTechTalk.
Transcript
The Cutting Edge of Natural Language Processing and Deep Learning. 徳永拓之, JustTechTalk
I will introduce a part of the cutting edge of natural language processing and deep learning. 徳永拓之 (@tkng), JustTechTalk
About me: 徳永拓之 • Twitter ID: @tkng • I work on natural language processing and image processing at SmartNews, Inc.
What is natural language processing? • It deals with natural languages (≠ programming languages) • Machine translation • Question answering • Document classification • Syntactic parsing / dependency parsing • Morphological analysis / word segmentation
An example of machine translation • The Word Lens feature of Google Translate (http://googletranslate.blogspot.jp/…/hallo-hola-ola-to-new-more-powerful.html)
An example of question answering • IBM Watson • Beat human champions at Jeopardy! (http://www.nytimes.com/…/science/…jeopardy-watson.html)
What is deep learning? • Roughly, neural networks • The recent boom is due to the following: • improved computing performance • growth in training data • progress in research on optimization methods and related techniques
The Cutting Edge of Natural Language Processing and Deep Learning
Show, Attend and Tell: Neural Image Caption Generation with Visual Attention (Xu+, 2015) • Generates descriptive captions for images (http://kelvinxu.github.io/projects/capgen.html)
How does Show, Attend and Tell work? • It combines the following three components • Convolutional Neural Network • Long Short-Term Memory • Attention
Generating Images from Captions with Attention (Mansimov+, 2015) • Generates images from captions • If you squint, the results do sort of look like airplanes
Effective Approaches to Attention-based Neural Machine Translation (Luong+, 2015) • Machine translation using deep learning • Proposes a new technique called local attention • Achieves state of the art (the best published results) on several language pairs
Ask Me Anything: Dynamic Memory Networks for Natural Language Processing (Kumar+, 2015) • Proposes a new model, Dynamic Memory Networks • The model looks like a combination of recurrent neural networks • State of the art on question answering, part-of-speech tagging, coreference resolution, and sentiment analysis
The current state of deep learning in NLP • In terms of accuracy, it has not opened a large gap over other methods • That is different from image processing and speech recognition • Machine translation and question answering can now be tackled with simple methods • Sentence generation has become possible
What happens next? • Honestly, I don't really know… • Research combining NLP with images and video will probably increase
To keep up with the cutting edge
I will narrow the explanation down to three topics • The basics of neural networks • Recurrent neural networks • In particular the Gated Recurrent Unit • Attention
A neural network = a function • A neural network can be viewed as a certain kind of function • Its inputs and outputs are vectors • It is differentiable
Starting from a simple example: y = f(x) = Wx
Normalize the output to the range 0 to 1 • y = softmax(f(x))
Let's make it multi-layer • y = softmax(g(f(x)))
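To make the "a neural network is just a differentiable function" view concrete, here is a minimal NumPy sketch of y = softmax(g(f(x))). The layer sizes, the random weights, and the tanh nonlinearity inserted between the two linear maps are illustrative assumptions of mine; the slides themselves only show the linear form f(x) = Wx.

    import numpy as np

    def softmax(z):
        # Shift by the max for numerical stability, then normalize so entries sum to 1.
        e = np.exp(z - np.max(z))
        return e / e.sum()

    # Illustrative sizes (not from the slides): 4-d input, 8 hidden units, 3 output classes.
    rng = np.random.default_rng(0)
    W1 = rng.standard_normal((8, 4))
    W2 = rng.standard_normal((3, 8))

    def f(x):
        return W1 @ x              # the slide's f(x) = Wx: a linear map

    def g(h):
        return W2 @ np.tanh(h)     # second layer; tanh is the usual nonlinearity between layers

    x = rng.standard_normal(4)
    y = softmax(g(f(x)))           # y = softmax(g(f(x))): entries in (0, 1), summing to 1
    print(y, y.sum())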
Which part is the "layer"?
A bit of trivia • Be careful with the word "layer" • It is ambiguous which part it refers to (you can usually tell from context if you read carefully…) • In the major open-source frameworks it usually refers to the function (Caffe, Torch, Chainer, TensorFlow)
Recurrent Neural Network • A general term for networks that receive the elements of a sequence one at a time and update an internal state • Very popular recently (http://colah.github.io/posts/…-Understanding-LSTMs/)
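A minimal sketch of that idea, namely a network that consumes a sequence one element at a time while updating a state. It uses the textbook vanilla-RNN update h_t = tanh(Wx x_t + Wh h_{t-1}) in NumPy; the dimensions and names are assumptions for illustration, and a GRU or LSTM would replace rnn_step with a gated update.

    import numpy as np

    rng = np.random.default_rng(1)
    input_dim, hidden_dim = 4, 6
    Wx = rng.standard_normal((hidden_dim, input_dim))    # input-to-state weights
    Wh = rng.standard_normal((hidden_dim, hidden_dim))   # state-to-state weights

    def rnn_step(h_prev, x_t):
        # Vanilla RNN update: mix the current input with the previous state.
        return np.tanh(Wx @ x_t + Wh @ h_prev)

    # Consume a variable-length sequence one element at a time, updating the state.
    sequence = [rng.standard_normal(input_dim) for _ in range(5)]
    h = np.zeros(hidden_dim)
    for x_t in sequence:
        h = rnn_step(h, x_t)
    print(h)  # the final state is a fixed-length summary of the whole sequence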
Why are RNNs so popular? • Variable-length data is hard to handle • It has become clear that seq2seq models built on RNNs (also called encoder/decoder models) handle variable-length data well
What is the seq2seq model? • Encode the variable-length input into a fixed-length vector, then decode the translated output from it • Research has been advancing on tasks whose input and output lengths differ, such as machine translation and automatic summarization
Translation with a seq2seq model (diagram: the input "This is a pen <EOS>" is encoded and then decoded into "これ は ペン です <EOS>")
Translation with a seq2seq model (same diagram, annotated: "This is a pen" is being encoded into a fixed-length vector!)
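A rough NumPy sketch of the encoder/decoder picture above: the encoder folds the variable-length source tokens into a single fixed-length vector, and the decoder generates target tokens from that vector alone. All parameter names, the token IDs, and the greedy argmax decoding are hypothetical simplifications, not the actual seq2seq models discussed in the talk.

    import numpy as np

    rng = np.random.default_rng(2)
    dim, vocab = 8, 10
    # Hypothetical parameters of an encoder RNN, a decoder RNN, and an output projection.
    We, Ue = rng.standard_normal((dim, dim)), rng.standard_normal((dim, dim))
    Wd, Ud = rng.standard_normal((dim, dim)), rng.standard_normal((dim, dim))
    Wo = rng.standard_normal((vocab, dim))
    embed = rng.standard_normal((vocab, dim))            # token embeddings
    EOS = 0

    def encode(src_ids):
        # Fold the whole source sentence into a single fixed-length vector h.
        h = np.zeros(dim)
        for t in src_ids:
            h = np.tanh(We @ embed[t] + Ue @ h)
        return h

    def decode(h, max_len=10):
        # Generate target tokens one by one, starting from the encoded vector.
        out, prev = [], EOS
        for _ in range(max_len):
            h = np.tanh(Wd @ embed[prev] + Ud @ h)
            prev = int(np.argmax(Wo @ h))                # greedy choice of the next token
            if prev == EOS:
                break
            out.append(prev)
        return out

    print(decode(encode([3, 1, 4, 1, 5])))               # untrained weights: output is meaningless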
What happens when the seq2seq model is used for translation? • It is known to work quite well • It does have the following weaknesses, though • Weak on long sentences • Proper nouns get swapped • Attention, explained next, is what addresses this
What is attention? • A mechanism that lets the decoder refer back to a small part of the encoder-side information • "A small part" = look only at the selected portions • There are two variants: global attention and local attention
Global attention • The attention is a weighted combination of the candidate states • Historically, this variant is slightly older (diagram: attending over all the states of "This is a pen <EOS>" while decoding "これ")
Local attention • Select only a few of the encoder-side states and use those (diagram: attending to a subset of the states of "This is a pen <EOS>" while decoding "これ")
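A minimal sketch of the attention computation, closest in spirit to the global attention just described: score every encoder state against the current decoder state, normalize the scores with a softmax, and use the weighted sum as the context the decoder consults. The dot-product scoring is one common choice assumed here for simplicity; the papers above define their own scoring functions.

    import numpy as np

    def softmax(z):
        e = np.exp(z - np.max(z))
        return e / e.sum()

    def attend(decoder_state, encoder_states):
        # Score each encoder state against the current decoder state (dot product),
        # normalize the scores, and return the weighted sum as the context vector.
        scores = np.array([s @ decoder_state for s in encoder_states])
        weights = softmax(scores)
        context = sum(w * s for w, s in zip(weights, encoder_states))
        return context, weights

    rng = np.random.default_rng(3)
    enc = [rng.standard_normal(6) for _ in range(4)]     # e.g. states for "This is a pen"
    dec = rng.standard_normal(6)
    context, weights = attend(dec, enc)
    print(weights)   # how much the decoder "looks at" each source position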
The order of attention • The order in which attention is applied differs between global attention and local attention • The attention order can also be learned with an RNN • Simply attending to positions in order from the beginning of the sentence already improves performance
Experimental results: WMT'14
Experimental results: WMT'15
Examples of actual translations
Summary so far • An explanation of basic neural networks • Recurrent neural networks • Attention
Things I did not cover today • Training (backpropagation, minibatches) • Loss functions (log loss, cross-entropy loss) • Regularization techniques • dropout, batch normalization • Optimization techniques • RMSProp, AdaGrad, Adam • Various activation functions • (Very) Leaky ReLU, Maxout
Recommendations going forward • Try some experiments yourself • Start by getting something running, even a simple example • Once it runs, try modifying it yourself • What matters most is getting your hands moving • Don't reach for something too difficult right from the start
Antennas for the latest information (1) • Follow people on Twitter who post about machine learning and related topics • Start with @hillbig • There are many other good people as well • But be careful, there are some dubious people too
Antennas for the latest information (2) • Read papers • Be careful: some papers are a waste of time to read • At first, it is better to stick to papers accepted at well-known venues (ACL, EMNLP, ICML, NIPS, KDD, etc.)
Antennas for the latest information (3) • Pay attention to paper authors • As you read, a handful of authors whose papers you find interesting will emerge • Find some way to keep checking those authors' new papers
Take-home messages • While your enthusiasm is high, run various experiments with your own hands • For beginners, Chainer is recommended • The latest information is available online • Watch out for people whose "high-mindedness" points in strange directions