Literature Review: September 3 (文献紹介 9月3日)
gumigumi7
September 03, 2018
Pervasive Attention: 2D Convolutional Neural Networks
for Sequence-to-Sequence Prediction
Transcript
Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction
▪ Maha Elbayad, Laurent Besacier, Jakob Verbeek
▪ Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction. The SIGNLL Conference on Computational Natural Language Learning (CoNLL). 2018.
▪ Background: conventional sequence-to-sequence models pair an RNN encoder with an RNN decoder.
▪ An attention mechanism computes, at each decoder step, a weighted combination of the encoder states, and this context vector is fed to the decoder.
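As a point of reference for the baseline the slides describe, the attention step in an RNN encoder-decoder can be sketched in a few lines of NumPy. All sizes and values here are illustrative placeholders, not figures from the paper:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy dimensions (illustrative only): source length 4, hidden size 3.
rng = np.random.default_rng(0)
enc_states = rng.standard_normal((4, 3))  # encoder hidden states h_1..h_4
dec_state = rng.standard_normal(3)        # current decoder hidden state

scores = enc_states @ dec_state   # dot-product alignment score per source token
weights = softmax(scores)         # attention distribution over the source
context = weights @ enc_states    # context vector fed back into the decoder
```

Each decoder step recomputes `weights`, so the model can look at a different part of the source for every target token.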
▪ Proposed method: replace the separate encoder, decoder, and attention module with a single 2D CNN applied over the (target × source) grid, so attention-like interactions are pervasive across every layer of the network.
▪ The convolutional stack is a DenseNet.
▪ The approach is competitive with the state of the art (SoTA).
▪ Convolutions are masked along the target dimension, so each output depends only on the current and past target tokens, preserving autoregressive decoding.
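A minimal sketch of what target-causal masking means for a single-channel 2D convolution over the (target, source) grid. This is an illustrative reimplementation under assumed shapes, not the paper's code:

```python
import numpy as np

def masked_conv2d(x, k):
    """Single-channel 2D convolution over a (target, source) grid that is
    causal along the target axis: the output at target row i only sees
    rows <= i, while the source axis gets ordinary symmetric padding.
    x: (T, S) grid; k: (kt, ks) kernel with ks odd."""
    T, S = x.shape
    kt, ks = k.shape
    ps = ks // 2
    # kt-1 rows of padding above (the past side only): no access to future rows
    xp = np.pad(x, ((kt - 1, 0), (ps, ps)))
    out = np.empty_like(x)
    for i in range(T):
        for j in range(S):
            out[i, j] = np.sum(xp[i:i + kt, j:j + ks] * k)
    return out
```

Because the kernel is padded only on the past side of the target axis, changing a later target row cannot affect earlier outputs, which is what allows the same network to be used for left-to-right autoregressive decoding.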
▪ Source sequence pooling: the final feature tensor is collapsed along the source dimension (e.g. by max-pooling) to give one representation per target position.
▪ An attention-style, self-attention-like weighted pooling (Hpool) is also considered.
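A toy NumPy sketch of both ways of collapsing the source axis: max-pooling, and a softmax-weighted average. The tensor sizes and the scalar scores (a simple channel sum) are invented for illustration and are not the paper's exact formulation:

```python
import numpy as np

# Feature tensor produced by the conv stack: (T, S, C) =
# (target length, source length, channels). Values are random placeholders.
rng = np.random.default_rng(1)
H = rng.standard_normal((5, 4, 8))

# Max-pooling: collapse the source axis, one feature vector per target position.
H_max = H.max(axis=1)  # shape (T, C)

# Attention-style alternative: softmax weights over the source axis, derived
# here from made-up scalar scores (the channel sum), then a weighted average.
scores = H.sum(axis=2)                                  # (T, S)
w = np.exp(scores - scores.max(axis=1, keepdims=True))
w = w / w.sum(axis=1, keepdims=True)                    # each row sums to 1
H_attn = (w[..., None] * H).sum(axis=1)                 # (T, C)
```

Either reduction turns the 2D grid of features into a per-target-token sequence that a classifier over the target vocabulary can consume.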
▪ Experiments
  ▪ Dataset: IWSLT (De-En)
  ▪ Parameters:
    ▪ DenseNet layer size: 24
    ▪ DenseNet growth rate: 32
    ▪ embedding size: 128
  ▪ Pooling and attention-style aggregation are compared.
▪ Results are comparable to SoTA.