Figure 2: the step function. N.B. it is not differentiable.

Common activation functions: the logistic sigmoid function, the hyperbolic tangent function, and ReLU (the rectified linear function).

2 The chain rule

\[ \frac{d}{dx} x^2 = 2x, \qquad (1) \]

With \( x = g(y) \),

\[ \frac{d}{dy} f(x) = \frac{d}{dy} f(g(y)) = \frac{df}{dx} \frac{dg}{dy}, \qquad (2) \]

and applied repeatedly through the layers \( h_n, h_{n-1}, \ldots, h_{n-x} \),

\[ \frac{d}{dh_{n-x}} f(h_n) = \frac{df}{dh_n} \cdot \frac{dh_n}{dh_{n-1}} \cdot \cdots \cdot \frac{dh_{n-x+1}}{dh_{n-x}}. \qquad (3) \]

3 The credit assignment problem

4 Activation functions (old)

An activation function:

\[ y = \operatorname{sign}\left( \sum_{i=1}^{N} w_i x_i + b \right) \qquad (4) \]

4.1 Activation functions

The logistic sigmoid:

\[ \sigma(x) = \left(1 + e^{-x}\right)^{-1} \qquad (5) \]

The hyperbolic tangent:

\[ \phi(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}} \qquad (6) \]

ReLU:

\[ \operatorname{ReLU}(x) = \max(0, x) \qquad (7) \]

The softplus:

\[ f(x) = \log\left(1 + e^x\right) \qquad (8) \]

8.0.1 Graphs of the activation functions
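The activation functions in Eqs. (5)-(8) can be sketched directly in NumPy; a minimal, self-contained version (function names are my own, not from the notes), with a finite-difference check that the sigmoid's derivative matches the well-known closed form \( \sigma'(x) = \sigma(x)(1 - \sigma(x)) \):

```python
import numpy as np

def sigmoid(x):
    # Eq. (5): logistic sigmoid, (1 + e^{-x})^{-1}
    return 1.0 / (1.0 + np.exp(-x))

def tanh_act(x):
    # Eq. (6): hyperbolic tangent, (e^x - e^{-x}) / (e^x + e^{-x})
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

def relu(x):
    # Eq. (7): rectified linear unit, max(0, x)
    return np.maximum(0.0, x)

def softplus(x):
    # Eq. (8): smooth approximation to ReLU, log(1 + e^x)
    return np.log1p(np.exp(x))

# Numerical check: sigma'(x) = sigma(x) * (1 - sigma(x))
x, h = 0.5, 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)   # central difference
analytic = sigmoid(x) * (1.0 - sigmoid(x))
print(abs(numeric - analytic) < 1e-6)   # the two agree to high precision
```

Note that `softplus` is differentiable everywhere, unlike the step function of Figure 2 and unlike ReLU at the origin, which is one reason smooth activations mattered for gradient-based training.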
Filters are called kernels in machine learning. To deepen your understanding, have a look at http://colah.github.io/posts/2014-07-Understanding-Convolutions/

8.0.2 An alternative solution to the exclusive OR (XOR) problem: see the PDP book (1986), chapter 8, Figure 2.

8.0.3 The convolution operation

What is convolution?

\[ (f * g)(t) = \sum_{a+b=t} f(a) \cdot g(b) \qquad (9) \]

\[ (f * g)(t) = \sum_{a} f(a) \cdot g(t - a) \qquad (10) \]

\[ (f * g)(t) = \sum_{\tau} f(\tau) \cdot g(t - \tau) \qquad (11) \]

\[ (f * g)(t) = \int f(\tau) \, g(t - \tau) \, d\tau \qquad (12) \]

8.1 Hubel and Wiesel, a series of studies

1959: Receptive Fields of Single Neurones in the Cat's Striate Cortex
1962: Receptive Fields, Binocular Interaction and Functional Architecture in the Cat's Visual Cortex
1968
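The discrete convolution of Eq. (10) can be written out as a short sketch (the function name `conv1d` is my own); its output agrees with NumPy's built-in `np.convolve` in "full" mode:

```python
import numpy as np

def conv1d(f, g):
    # Eq. (10): (f * g)(t) = sum_a f(a) * g(t - a),
    # summing over all a for which both indices are valid.
    t_max = len(f) + len(g) - 2
    out = np.zeros(t_max + 1)
    for t in range(t_max + 1):
        for a in range(len(f)):
            b = t - a
            if 0 <= b < len(g):
                out[t] += f[a] * g[b]
    return out

f = np.array([1.0, 2.0, 3.0])   # signal
g = np.array([0.0, 1.0, 0.5])   # kernel (filter)
result = conv1d(f, g)
print(result)                    # same values as np.convolve(f, g)
```

The index substitution b = t - a is exactly what turns the symmetric form of Eq. (9), summed over all pairs with a + b = t, into the single-sum forms of Eqs. (10) and (11).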
It is called AlexNet after its first author, Alex (Krizhevsky).

Figure 7: GoogLeNet. The spelling Goog{L}e{N}et is an homage to, and a mark of respect for, LeNet.

Figure 8: The GoogLeNet Inception module. But why is this model called Inception?

GoogLeNet (2014)

GoogLeNet (2014) #2