Shotaro Ishihara
April 18, 2018

# 『深層学習』第7章「再帰型ニューラルネット」輪読会資料 / Deep Learning Chapter 7


## Transcript

2. ###  2 (agenda slide; text garbled in extraction — bullet points on RNNs)

4. ###  4 "We can get an idea of the quality of the learned feature vectors by displaying them in a 2-D map."
5. ###  5 Conventional representations: Bag of Words, N-gram. "We can get an idea of the quality ..." (remaining slide text garbled in extraction)

14. ### RNN 14 At each time step the network receives the input x_t together with the previous hidden state z_{t-1} and produces the output y_t.

15. ### RNN 15 x_t, z_{t-1} → y_t
18. ### BPTT 18 Given the input sequence x and the outputs y_1, ..., y_t, backpropagation through time unfolds the network and propagates the output deltas δ^k_{out,t} (t = 1, ..., T) back to the hidden-unit deltas δ^t_j.

19. ### BPTT 19 δ^k_{out,1}, δ^k_{out,2}, δ^k_{out,3}, ..., δ^k_{out,t} → δ^t_j
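The delta propagation on these slides can be checked by brute force. The sketch below assumes a tanh hidden layer, a linear output layer, and squared error (illustrative choices, not fixed by the slides), computes the BPTT gradient of the recurrent weight matrix R, and compares one entry against finite differences.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_h, n_out, T = 2, 3, 2, 4
W = rng.normal(size=(n_h, n_in)) * 0.5
R = rng.normal(size=(n_h, n_h)) * 0.5
V = rng.normal(size=(n_out, n_h)) * 0.5
xs = rng.normal(size=(T, n_in))   # input sequence
ds = rng.normal(size=(T, n_out))  # target sequence

def forward(R):
    """Forward pass; returns the total loss plus the states needed for BPTT."""
    z = np.zeros(n_h)
    zs, ys, loss = [], [], 0.0
    for t in range(T):
        z = np.tanh(W @ xs[t] + R @ z)
        y = V @ z
        zs.append(z); ys.append(y)
        loss += 0.5 * np.sum((y - ds[t]) ** 2)
    return loss, zs, ys

def bptt_grad_R(R):
    """Backward pass through time: accumulate dLoss/dR."""
    _, zs, ys = forward(R)
    grad = np.zeros_like(R)
    delta_next = np.zeros(n_h)  # pre-activation delta arriving from step t+1
    for t in reversed(range(T)):
        dout = ys[t] - ds[t]                # output delta (delta_out,t)
        g = V.T @ dout + R.T @ delta_next   # deltas flowing into z_t
        delta = (1 - zs[t] ** 2) * g        # back through tanh
        z_prev = zs[t - 1] if t > 0 else np.zeros(n_h)
        grad += np.outer(delta, z_prev)     # contribution of step t to dL/dR
        delta_next = delta
    return grad

# Compare one entry of the analytic gradient with central differences.
g = bptt_grad_R(R)
eps = 1e-6
Rp = R.copy(); Rp[0, 1] += eps
Rm = R.copy(); Rm[0, 1] -= eps
num = (forward(Rp)[0] - forward(Rm)[0]) / (2 * eps)
```

The key point the slides illustrate is the `R.T @ delta_next` term: each step's delta depends on the delta of the step after it, which is why the loop must run backwards over the unfolded time steps.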

24. ### RNN 24 #@10+'<3= 0A; ← &91,?7 &9\$)+/" )  4

*58&90 or :( !.2-  ← RNN%>264
25. ### LSTM 25 Long Short-Term Memory (LSTM): extends the RNN unit to address its weaknesses (remaining slide text garbled in extraction)


32. ### (HMM) 32 Hidden Markov model (slide text garbled in extraction)
33. ###  33 Connectionist temporal classification (CTC): an HMM-like approach that lets an RNN label unsegmented sequences (remaining slide text garbled in extraction)
34. ### CTC 34   X = x , ... ,

x  l = l , … , l   = p( l | X ) 1 t 1 |l|
35. ### CTC 35   l = ‘ab’ t = 6

a, b, , , , a, , , b, , , , , a, , b …
36. ### CTC 36 = p( l | X ) a, b,

, , , a, a, , b, , , , , a, , b … p( l1 | X ) = p( l2 | X ) = p( l3 | X ) = = p(a)*p(b)*p( )*p( ) *p( )*p( ) = p(a)*p(a)*p( )*p(b) *p( )*p( ) = p( )*p( )*p( )*p(a)*p( )*p(b)
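For an example this small, the sum over paths can be verified by brute force. The per-frame probabilities below are made-up values, and '-' denotes the blank symbol; collapsing a path means merging repeated symbols and then removing blanks, as in the slide.

```python
from itertools import product

# Per-frame probabilities over {a, b, blank}; illustrative values, each summing to 1.
probs = [{'a': 0.6, 'b': 0.1, '-': 0.3} for _ in range(6)]

def collapse(path):
    """Merge repeats, then drop blanks: ('a', 'a', '-', 'b') -> 'ab'."""
    out = []
    for s in path:
        if not out or s != out[-1]:
            out.append(s)
    return ''.join(s for s in out if s != '-')

def ctc_prob(label, probs):
    """p(label | X): sum of path probabilities over all paths that collapse to label."""
    total = 0.0
    for path in product('ab-', repeat=len(probs)):
        if collapse(path) == label:
            p = 1.0
            for t, s in enumerate(path):
                p *= probs[t][s]
            total += p
    return total

p_ab = ctc_prob('ab', probs)
```

Enumerating all 3^6 = 729 paths is only feasible for toy cases like this; in practice the same sum is computed with the CTC forward-backward dynamic program.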
37. ###  37 References
• Takayuki Okatani (2015): Deep Learning (深層学習), Kodansha
• Chapter 7 slides (2015): http://www.slideshare.net/shotarosano5/chapter7-50542830 (accessed August 2016)
• Recurrent Neural Networks (2014): http://www.slideshare.net/beam2d/pfi-seminar-20141030rnn?qid=9e5894c7-f162-4da3-b082-a1e4963689e8&v=&b=&from_search=17 (accessed August 2016)
• (2013): (entry garbled in extraction)
• LSTM article on Qiita (2016): http://qiita.com/t_Signull/items/21b82be280b46f467d1b (accessed August 2016)
• A. Graves (2008): Supervised sequence labelling with Recurrent Neural Networks, PhD thesis, Technische Universität München, https://www.cs.toronto.edu/~graves/preprint.pdf