Literature Review: September 3
gumigumi7
September 03, 2018
Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction
Transcript
Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction
▪ Maha Elbayad, Laurent Besacier, Jakob Verbeek
▪ Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction
▪ The SIGNLL Conference on Computational Natural Language Learning (CoNLL), 2018
▪ Background: conventional sequence-to-sequence models encode the source sentence with an RNN and generate the target sentence with an RNN decoder.
▪ With attention, the decoder computes a weighted sum of the encoder states at every decoding step, instead of conditioning on a single fixed context vector.
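As a reference point, a minimal sketch of dot-product attention inside an RNN decoder (PyTorch; the shapes and the name attention_context are illustrative, not taken from the paper):

import torch
import torch.nn.functional as F

def attention_context(decoder_state, encoder_states):
    # decoder_state:  (batch, hidden)
    # encoder_states: (batch, src_len, hidden)
    # Score every encoder state against the current decoder state.
    scores = torch.bmm(encoder_states, decoder_state.unsqueeze(2)).squeeze(2)
    weights = F.softmax(scores, dim=1)  # normalize over source positions
    # Context vector: weighted sum of encoder states, (batch, hidden).
    return torch.bmm(weights.unsqueeze(1), encoder_states).squeeze(1)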
▪ Proposed method: replace the encoder/decoder pair with a single 2D CNN applied to a grid with one cell per (target position, source position) pair.
▪ Every convolutional layer recomputes interactions between all source and target positions, so attention is pervasive throughout the network rather than confined to a single layer.
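A sketch of how such a grid can be assembled, assuming each cell holds the concatenation of the target and source embeddings at that position pair (the tensor layout and the name build_grid are my own):

import torch

def build_grid(src_emb, tgt_emb):
    # src_emb: (batch, src_len, d); tgt_emb: (batch, tgt_len, d)
    b, s, d = src_emb.shape
    t = tgt_emb.shape[1]
    src = src_emb.unsqueeze(1).expand(b, t, s, d)  # repeat along target axis
    tgt = tgt_emb.unsqueeze(2).expand(b, t, s, d)  # repeat along source axis
    grid = torch.cat([tgt, src], dim=3)            # (batch, tgt_len, src_len, 2d)
    return grid.permute(0, 3, 1, 2)                # channels-first for nn.Conv2d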
▪ The convolutional network follows the DenseNet architecture, which has achieved state-of-the-art (SoTA) results in image classification.
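For reference, a minimal DenseNet-style block in the standard formulation (each layer emits growth_rate channels that are concatenated onto its input; a generic sketch, not the paper's exact block):

import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    def __init__(self, in_ch, growth_rate, n_layers, kernel=3):
        super().__init__()
        self.layers = nn.ModuleList([
            nn.Sequential(
                nn.BatchNorm2d(in_ch + i * growth_rate),
                nn.ReLU(inplace=True),
                nn.Conv2d(in_ch + i * growth_rate, growth_rate,
                          kernel, padding=kernel // 2),
            )
            for i in range(n_layers)
        ])

    def forward(self, x):
        for layer in self.layers:
            # Dense connectivity: every layer sees all earlier outputs.
            x = torch.cat([x, layer(x)], dim=1)
        return x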
▪ Convolutions along the target axis are masked so that the representation at target step t depends only on target tokens up to t, preserving the autoregressive property during training.
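One way to realize this is asymmetric padding along the target (height) axis, so the convolution at target row t only covers rows t-k+1..t; zeroing the "future" half of the filter weights, as the paper's masking is usually described, gives the same receptive field. A sketch under that assumption:

import torch.nn as nn
import torch.nn.functional as F

class CausalTargetConv2d(nn.Conv2d):
    # Causal along the target (height) axis; centered along source (width).
    def __init__(self, in_ch, out_ch, kernel=3):
        super().__init__(in_ch, out_ch, kernel, padding=0)
        self.k = kernel

    def forward(self, x):
        # x: (batch, channels, tgt_len, src_len)
        # F.pad order: (src_left, src_right, tgt_top, tgt_bottom)
        x = F.pad(x, (self.k // 2, self.k // 2, self.k - 1, 0))
        return super().forward(x)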
▪ Source sequence pooling: to predict each target token, the final feature tensor is pooled over the source axis (e.g., max pooling).
▪ A self-attention-like variant instead weights source positions with learned scores before averaging, producing the pooled representation H_pool.
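A sketch of both pooling variants over the source axis (the scorer module in the attention variant is hypothetical, standing in for whatever produces one scalar per grid cell):

import torch.nn.functional as F

def max_pool_source(h):
    # h: (batch, channels, tgt_len, src_len) final feature tensor
    return h.max(dim=3).values            # (batch, channels, tgt_len)

def attention_pool_source(h, scorer):
    scores = scorer(h)                    # (batch, 1, tgt_len, src_len), hypothetical scorer
    weights = F.softmax(scores, dim=3)    # normalize over source positions
    return (weights * h).sum(dim=3)       # weighted average per target step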
▪ Dataset: IWSLT German-English (De-En)
▪ Hyperparameters:
▪ DenseNet layer size: 24
▪ DenseNet growth rate: 32
▪ Embedding size: 128
▪ Both max pooling and attention-based pooling are evaluated
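Collected into an illustrative config (field names are mine; values are from the slide):

config = {
    "dataset": "IWSLT (De-En)",
    "densenet_layers": 24,       # "layer size" on the slide
    "densenet_growth_rate": 32,
    "embedding_size": 128,
}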
▪ Results: the proposed model is competitive with state-of-the-art (SoTA) systems on IWSLT De-En.
▪ Conclusion: a single 2D CNN over the joint source-target grid, with attention-like interactions at every layer, is a viable alternative to encoder-decoder models with attention for sequence-to-sequence prediction.