EMNLP2015 Reading Group: Effective Approaches to Attention-based Neural Machine Translation
tkng
October 24, 2015
Transcript
Effective Approaches to Attention-based Neural Machine Translation
Authors: Minh-Thang Luong, Hieu Pham, Christopher D. Manning
Presenter: tkng
All figures are cited from the paper. EMNLP2015 reading group.
Self-introduction
• Twitter ID: @tkng
• I work on NLP at SmartNews, Inc.
Today's paper
• Effective Approaches to Attention-based Neural Machine Translation
• An extension of the seq2seq family of methods, which started gaining popularity around last year
What is a seq2seq model?
• Also called an encoder/decoder model
• The source sentence is encoded into a fixed-length vector, and the target sentence is decoded from it
• Variable-length data is hard to handle, so the idea is to deliberately force it into a fixed-length representation
How is the sentence encoded into a fixed-length vector?
• By using a recurrent neural network
• http://colah.github.io/posts/2015-08-Understanding-LSTMs/
• http://kaishengtai.github.io/static/slides/treelstm-acl2015.pdf
• LSTM = one kind of recurrent neural network
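The "fold a variable-length sentence into one fixed-length vector" idea can be sketched with a toy vanilla RNN in NumPy. This is only an illustration of the encoding step, not the stacked LSTM used in the paper; all dimensions and weight initializations here are made up for the example.

```python
import numpy as np

def rnn_encode(embeddings, W_h, W_x, b):
    """Fold a variable-length sequence of token vectors into one
    fixed-length vector: the final hidden state of a vanilla RNN."""
    h = np.zeros(W_h.shape[0])
    for x in embeddings:                  # one recurrence step per token
        h = np.tanh(W_h @ h + W_x @ x + b)
    return h

# Toy dimensions, chosen arbitrarily for the sketch.
rng = np.random.default_rng(0)
d_emb, d_hid = 4, 8
W_h = rng.normal(scale=0.1, size=(d_hid, d_hid))
W_x = rng.normal(scale=0.1, size=(d_hid, d_emb))
b = np.zeros(d_hid)

short = rng.normal(size=(3, d_emb))   # 3-token "sentence"
long_ = rng.normal(size=(10, d_emb))  # 10-token "sentence"
# Both collapse to a vector of the same fixed dimensionality:
assert rnn_encode(short, W_h, W_x, b).shape == (d_hid,)
assert rnn_encode(long_, W_h, W_x, b).shape == (d_hid,)
```

The decoder then runs the same kind of recurrence in reverse, conditioning each output step on this single vector.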
Translation with a seq2seq model
The road to seq2seq (1)
• Recurrent Continuous Translation Models (EMNLP2013)
• Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation (EMNLP2014)
The road to seq2seq (2)
• Sequence to Sequence Learning with Neural Networks (NIPS2014)
• Experiments showed that a comparatively simple stacked LSTM achieves good performance
• It incorporates three tricks: beam search, feeding the input in reverse order, and ensembling
Weaknesses of the seq2seq model
• Weak on long sentences
• Proper nouns get swapped
Improvement via attention [Bahdanau+ 2015]
• During decoding, the context is a weighted sum of the encoder's hidden states at each time step
• The weights themselves are also computed by the network
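The weighted-sum context described above can be sketched as follows. For simplicity this sketch scores each encoder state by a plain dot product with the decoder state (one of the scoring functions discussed in the Luong+ paper); the dimensions are invented for the example.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())   # subtract max for numerical stability
    return e / e.sum()

def global_attention(enc_states, dec_state):
    """Context vector = softmax-weighted sum over ALL encoder hidden
    states, with a dot-product score between each encoder state and
    the current decoder state."""
    scores = enc_states @ dec_state      # (src_len,)
    weights = softmax(scores)            # alignment weights, sum to 1
    context = weights @ enc_states       # (d_hid,)
    return context, weights

rng = np.random.default_rng(1)
enc = rng.normal(size=(6, 8))   # 6 source positions, hidden dim 8
dec = rng.normal(size=8)
ctx, w = global_attention(enc, dec)
assert np.isclose(w.sum(), 1.0) and ctx.shape == (8,)
```

Because every source position contributes at every decoding step, the decoder is no longer bottlenecked by a single fixed vector.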
Contributions of this paper
• Proposes a new attention mechanism (local attention)
• In the source sentence, it takes a weighted sum of the hidden states of the D words before and after position pt
• The weights are computed in the same way as in global attention
• Two variants are tested: advancing pt one step at a time (local-m), and predicting pt itself with the network (local-p)
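A rough sketch of the windowed variant: only positions within D of pt contribute, and (as in the paper's local-p variant) the softmax weights are further damped by a Gaussian centred on pt with sigma = D/2. The dot-product score and all dimensions are assumptions for the example, not the paper's exact configuration.

```python
import numpy as np

def local_attention(enc_states, dec_state, p_t, D):
    """Local attention: attend only to source positions in
    [p_t - D, p_t + D], weighting the softmax alignment by a
    Gaussian centred on p_t (sigma = D/2, as in local-p)."""
    src_len, _ = enc_states.shape
    lo, hi = max(0, int(p_t) - D), min(src_len, int(p_t) + D + 1)
    window = enc_states[lo:hi]
    scores = window @ dec_state
    e = np.exp(scores - scores.max())
    align = e / e.sum()                       # softmax over the window
    s = np.arange(lo, hi)
    gauss = np.exp(-((s - p_t) ** 2) / (2 * (D / 2) ** 2))
    weights = align * gauss                   # favour positions near p_t
    context = weights @ window
    return context, weights

rng = np.random.default_rng(2)
enc = rng.normal(size=(10, 8))   # 10 source positions, hidden dim 8
dec = rng.normal(size=8)
ctx, w = local_attention(enc, dec, p_t=4.0, D=2)
assert ctx.shape == (8,) and len(w) == 5   # only 2*D+1 positions attended
```

Because p_t enters through the differentiable Gaussian term, local-p can learn the window position by backpropagation rather than treating it as a hard choice.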
local attention
Impressions of local attention
• For translation between languages with similar word order, this approach clearly seems better
• For language pairs whose word order differs greatly, such as English and Japanese, estimating the position pt itself looks like it would become a difficult task…
Experimental results: WMT'14
Experimental results: WMT'14
• Looking closely, the performance gain from local attention is +0.9 points
• Many of the points are earned by the other techniques
Experimental results: WMT'15
Some translation samples
Summary
• Proposed local attention as an extension of the seq2seq model
• The proposed method achieved state-of-the-art performance in several experiments
Impressions
• Local attention …
• How the method concretely differs from similar approaches is written clearly, which made the paper easy to read
• Glad I was able to understand attention (a simplistic impression, I know)