EMNLP 2015 Reading Group: Effective Approaches to Attention-based Neural Machine Translation
tkng
October 24, 2015
Transcript
Effective Approaches to Attention-based Neural Machine Translation. Authors: Minh-Thang Luong, Hieu Pham, Christopher D. Manning. Presenter: Hiroyuki Tokunaga. All figures are taken from the paper. EMNLP 2015 reading group.
Self-introduction: Hiroyuki Tokunaga • Twitter ID: @tkng • I work on NLP at SmartNews, Inc.
Today's paper • Effective Approaches to Attention-based Neural Machine Translation • An extension of the seq2seq family of methods that started to catch on around last year
What is a seq2seq model? • Also called an encoder/decoder model • Encode the source sentence into a fixed-length vector, then decode the target sentence from it • Variable-length data is awkward to handle, so the idea is to deliberately force everything into a fixed length (see the sketch below)
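To make the encode-into-one-vector idea concrete, here is a minimal numpy sketch. It is not the paper's model: a plain tanh RNN with toy sizes and untrained random weights, and every name in it is invented for illustration.

```python
import numpy as np

H, V = 8, 20                       # hidden size and toy vocabulary size (arbitrary)
rng = np.random.default_rng(0)
E  = rng.normal(0, 0.1, (V, H))    # embedding table
Wx = rng.normal(0, 0.1, (H, H))    # input-to-hidden weights
Wh = rng.normal(0, 0.1, (H, H))    # hidden-to-hidden weights
Wo = rng.normal(0, 0.1, (H, V))    # hidden-to-vocabulary output weights

def encode(src_ids):
    """Run a plain tanh RNN over the source and keep only the last hidden state."""
    h = np.zeros(H)
    for i in src_ids:
        h = np.tanh(E[i] @ Wx + h @ Wh)
    return h                        # the whole sentence as one fixed-length vector

def decode(h, max_len=10, bos=0, eos=1):
    """Greedily emit target tokens, conditioning only on that fixed vector."""
    out, prev = [], bos
    for _ in range(max_len):
        h = np.tanh(E[prev] @ Wx + h @ Wh)
        prev = int(np.argmax(h @ Wo))   # greedy choice; real systems use beam search
        if prev == eos:
            break
        out.append(prev)
    return out

print(decode(encode([3, 7, 2, 9])))     # untrained weights, so the output is arbitrary
```

With untrained weights the output is meaningless; the point is only the interface: a variable-length source is squeezed into one fixed vector, and the decoder sees nothing else.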
How do you encode into a fixed length? • Use a recurrent neural network • http://colah.github.io/posts/2015-08-Understanding-LSTMs/ • http://kaishengtai.github.io/static/slides/treelstm-acl2015.pdf • LSTM = one kind of recurrent neural network (one step is sketched below)
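For reference, a single LSTM time step written out with the standard gate equations. This is a generic sketch, not anything specific to the paper; the stacked parameter shapes in the comments are assumptions of this sketch.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. x: input vector, (h, c): previous hidden and cell state.
    Assumed shapes: W is (x_dim, 4H), U is (H, 4H), b is (4H,), with the four
    blocks holding the input gate, forget gate, output gate, and cell candidate."""
    H = h.shape[0]
    z = x @ W + h @ U + b                       # all four pre-activations at once
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    i = sig(z[0:H])                             # input gate
    f = sig(z[H:2*H])                           # forget gate
    o = sig(z[2*H:3*H])                         # output gate
    g = np.tanh(z[3*H:4*H])                     # cell candidate
    c_new = f * c + i * g                       # cell state mixes old memory and new input
    h_new = o * np.tanh(c_new)                  # hidden state exposed to the next step
    return h_new, c_new
```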
Translation with a seq2seq model
The road to seq2seq (1) • Recurrent Continuous Translation Models (EMNLP 2013) • Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation (EMNLP 2014)
The road to seq2seq (2) • Sequence to Sequence Learning with Neural Networks (NIPS 2014) • Experiments showed that a relatively simple stacked LSTM achieves good performance • It relies on three tricks: beam search, feeding the input in reverse order, and ensembling (a beam-search sketch follows below)
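As a side note on the first of those tricks, a hedged sketch of beam-search decoding. The `step_logprobs` callback is a hypothetical stand-in for the decoder network and is not from the paper.

```python
def beam_search(step_logprobs, bos, eos, beam_size=4, max_len=20):
    """Keep the beam_size best partial hypotheses instead of decoding greedily.
    step_logprobs(prefix) must return a dict {token: log_prob} of next-token
    scores; it stands in for the decoder network (an assumption of this sketch)."""
    beams = [([bos], 0.0)]                       # (token sequence, cumulative log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, lp in step_logprobs(seq).items():
                candidates.append((seq + [tok], score + lp))
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for seq, score in candidates:
            if seq[-1] == eos:
                finished.append((seq, score))    # hypothesis is complete
            else:
                beams.append((seq, score))       # keep expanding it next step
            if len(beams) == beam_size:
                break
        if not beams:                            # every surviving hypothesis ended
            break
    return max(finished or beams, key=lambda c: c[1])[0]
```

A scorer from a real model would return log-probabilities over the whole vocabulary; here any dict-returning function works, e.g. `beam_search(lambda seq: {1: -1.0, 2: -0.5}, bos=0, eos=1)`.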
Weaknesses of the seq2seq model • Weak on long sentences • Proper nouns get swapped
Improvement with attention [Bahdanau+ 2015] • The context used at each decoding step is a weighted sum of the encoder hidden states from every source time step • The weights themselves are also computed from the RNN states (sketch below)
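A sketch of that weighted-sum idea. For simplicity the score here is a plain dot product rather than Bahdanau's learned alignment network (the dot product corresponds to one of the scoring variants in this paper); the names and shapes are my own.

```python
import numpy as np

def global_attention(dec_h, enc_hs):
    """Attend over all encoder states. dec_h: (H,) current decoder state,
    enc_hs: (src_len, H) encoder hidden states for every source position."""
    scores = enc_hs @ dec_h                  # one score per source position
    weights = np.exp(scores - scores.max())  # softmax into alignment weights a_t
    weights /= weights.sum()
    context = weights @ enc_hs               # context vector c_t, shape (H,)
    return context, weights
```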
Contributions of this paper • Proposes a new form of attention (local attention) • In the source sentence, take a weighted sum of the hidden states within D words before and after a position p_t • The weights are computed in the same way as for global attention • Two variants are tested: moving p_t forward monotonically one step at a time (local-m), and predicting p_t itself from the RNN state (local-p); a sketch of local-p follows below
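A rough sketch of the local-p variant, under these assumptions: dot-product alignment scores, the paper's Gaussian re-weighting around p_t with sigma = D/2, and parameters W_p and v_p whose shapes are chosen here only for illustration.

```python
import numpy as np

def local_p_attention(dec_h, enc_hs, W_p, v_p, D=10):
    """Local attention (local-p): predict a source position p_t from the decoder
    state, then attend only inside the window [p_t - D, p_t + D], damping the
    alignment weights with a Gaussian centered at p_t.
    Assumed shapes: dec_h (H,), enc_hs (S, H), W_p (A, H), v_p (A,)."""
    S = enc_hs.shape[0]
    p_t = S / (1.0 + np.exp(-(v_p @ np.tanh(W_p @ dec_h))))   # S * sigmoid(...)
    lo = max(0, int(np.floor(p_t)) - D)
    hi = min(S, int(np.floor(p_t)) + D + 1)
    window = enc_hs[lo:hi]
    scores = window @ dec_h                      # dot-product alignment scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                     # softmax over the window only
    positions = np.arange(lo, hi)
    weights = weights * np.exp(-((positions - p_t) ** 2) / (2 * (D / 2.0) ** 2))
    context = weights @ window                   # context vector from the window
    return context, p_t
```

Attending only to a fixed-size window keeps the per-word cost independent of source length, which is part of the paper's motivation for local attention.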
local attention
Impressions of local attention • For translation between languages with similar word order, this clearly looks like the better approach • For pairs with very different word order, such as English-Japanese, estimating the position p_t itself seems like it would become a hard task...
Experimental results: WMT'14
Experimental results: WMT'14 • Looking closely, the gain from local attention itself is +0.9 points • Much of the improvement is earned by the other techniques
Experimental results: WMT'15
Some translation samples
Summary • Proposed local attention as an extension of the seq2seq model • The proposed method achieved state-of-the-art performance on several experiments
Impressions • Local attention itself feels somewhat marginal • The paper explains clearly and concretely how the method differs from similar approaches, which made it easy to read • Glad to have finally understood attention (a simple impression, I know)