EMNLP 2015 Reading Group: Effective Approaches to Attention-based Neural Machine Translation
tkng
October 24, 2015
Transcript
Effective Approaches to Attention-based Neural Machine Translation
Authors: Minh-Thang Luong, Hieu Pham, Christopher D. Manning
Presenter: Tokunaga • All figures are cited from this paper • EMNLP reading group
Self-introduction: Tokunaga • Twitter ID: @tkng • Working on NLP at SmartNews, Inc.
Today's paper • Effective Approaches to Attention-based Neural Machine Translation • An extension of the seq2seq family of methods that started becoming popular around last year
What is a seq2seq model? • Also called an encoder/decoder model • Encodes the source sentence into a fixed-length vector, then decodes the target sentence from it • Variable-length data is hard to handle, so the idea is to deliberately make it fixed-length
How do you encode into a fixed-length vector? • Use a recurrent neural network • http://colah.github.io/posts/2015-08-Understanding-LSTMs/ • http://kaishengtai.github.io/static/slides/treelstm-acl2015.pdf • LSTM = one kind of recurrent neural network
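The "fold a variable-length sentence into one fixed-length vector" idea can be sketched with a vanilla RNN (a minimal illustration with made-up dimensions, using tanh instead of the LSTM cell the slides mention):

```python
import numpy as np

def rnn_encode(embeddings, W_h, W_x, b):
    """Vanilla RNN encoder: fold a variable-length sequence of word
    embeddings into one fixed-length hidden vector."""
    h = np.zeros(W_h.shape[0])
    states = []
    for x in embeddings:                   # one step per source word
        h = np.tanh(W_h @ h + W_x @ x + b)  # recurrent update
        states.append(h)
    return h, np.stack(states)             # final vector + per-step states

# toy dimensions: 4-dim word embeddings, 3-dim hidden state
rng = np.random.default_rng(0)
W_h = rng.normal(size=(3, 3))
W_x = rng.normal(size=(3, 4))
b = np.zeros(3)
sentence = rng.normal(size=(5, 4))         # a 5-word source sentence
h_final, all_states = rnn_encode(sentence, W_h, W_x, b)
```

Whatever the sentence length, `h_final` always has the same dimensionality; the per-step states become useful once attention enters the picture below.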
Translation with a seq2seq model
The road to seq2seq (1) • Recurrent Continuous Translation Models (EMNLP 2013) • Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation (EMNLP 2014)
The road to seq2seq (2) • Sequence to Sequence Learning with Neural Networks (NIPS 2014) • Experiments showed that a relatively simple stacked LSTM achieves good performance • Incorporates three tricks: beam search, feeding the input in reverse order, and ensembling
Weaknesses of the seq2seq model • Weak on long sentences • Proper nouns get swapped
Improvement via attention [Bahdanau+ 2015] • During decoding, use a weighted sum of the encoder hidden states at each timestep as the context • The weights themselves are computed by a small network from the hidden states
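The "weighted sum of encoder hidden states" step can be sketched as follows. This is a minimal numpy illustration using a dot-product score (one of the scoring functions considered in the Luong et al. paper); Bahdanau et al.'s original alignment model scores with a small feedforward network instead:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())                # subtract max for stability
    return e / e.sum()

def global_attention(dec_state, enc_states):
    """Context vector = weighted sum over ALL encoder hidden states;
    weights come from a dot-product score against the decoder state."""
    scores = enc_states @ dec_state        # one score per source position
    weights = softmax(scores)              # normalize to a distribution
    context = weights @ enc_states         # weighted sum of hidden states
    return context, weights

rng = np.random.default_rng(1)
enc_states = rng.normal(size=(6, 3))       # 6 source positions, hidden dim 3
dec_state = rng.normal(size=3)
context, weights = global_attention(dec_state, enc_states)
```

Because the context is recomputed at every decoding step, the decoder is no longer forced to squeeze the whole sentence through one fixed vector, which is exactly what the slide credits for the improvement on long sentences.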
Contributions of this paper • Proposes a new form of attention (local attention) • Takes a weighted sum of the hidden states within D words before and after a position pt in the source sentence • The weights are computed in the same way as in global attention • Two variants are evaluated: advancing pt one step at a time (local-m), and predicting pt itself from the decoder state (local-p)
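The local-p variant described above can be sketched like this. It is a minimal numpy illustration: the paper predicts pt from the decoder state with a small network, while here pt is simply passed in; the Gaussian damping with sigma = D/2 follows the paper:

```python
import numpy as np

def local_attention(dec_state, enc_states, p_t, D):
    """Local attention: attend only to positions within D words of p_t,
    with softmax weights damped by a Gaussian centred on p_t."""
    S = len(enc_states)
    lo, hi = max(0, int(p_t) - D), min(S, int(p_t) + D + 1)
    window = enc_states[lo:hi]             # only the [p_t - D, p_t + D] window
    scores = window @ dec_state            # same scoring as global attention
    e = np.exp(scores - scores.max())
    weights = e / e.sum()
    positions = np.arange(lo, hi)
    sigma = D / 2                          # sigma = D/2, as in the paper
    weights = weights * np.exp(-((positions - p_t) ** 2) / (2 * sigma**2))
    context = weights @ window
    return context, weights

rng = np.random.default_rng(2)
enc_states = rng.normal(size=(10, 3))      # 10 source positions, hidden dim 3
dec_state = rng.normal(size=3)
context, weights = local_attention(dec_state, enc_states, p_t=4.0, D=2)
```

With D = 2 and pt = 4, only positions 2–6 receive any weight; everything outside the window is ignored, which is what makes local attention cheaper than attending over the whole sentence.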
local attention
Impressions of local attention • For translation between languages with similar word order, this clearly seems better • For pairs with very different word order, like English→Japanese, estimating the position pt itself seems likely to become a hard task…
Experimental results: WMT'14
Experimental results: WMT'14 • Looking closely, the performance gain from local attention is +0.9 points • Many of the points are gained by the other techniques
Experimental results: WMT'15
Some translation samples
Summary • Proposed local attention as an extension of the seq2seq model • The proposed method achieved state-of-the-art performance in several experiments
Impressions • Local attention seems marginal(?) • The paper clearly explains how it concretely differs from similar methods, which made it easy to read • Glad to have finally understood attention