Slide 1

Slide 1 text

DEEP LEARNING

Slide 2

Slide 2 text

Models: AE / SAE, RBM / DBN, CNN, RNN / LSTM, MemNet / NTM. Agenda: What? Why? How? Next? Questions

Slide 3

Slide 3 text

What? An AI technique for learning multiple levels of abstraction directly from raw information.

Slide 4

Slide 4 text

Primitive rule-based AI: tailored systems. Input → Hand-Crafted Program → Output

Slide 5

Slide 5 text

Classical machine learning: learning from custom features. Input → Hand-Crafted Features → Learning System → Output

Slide 6

Slide 6 text

Deep Learning based AI: learn everything. Input → Learned Features (Lower Level) → Learned Features (Higher Level) → Learning System → Output

Slide 7

Slide 7 text

No content

Slide 8

Slide 8 text

Video: https://www.youtube.com/watch?v=Q70ulPJW3Gk

Slide 9

Slide 9 text

“With the capacity to represent the world in signs and symbols, comes the capacity to change it.” Elizabeth Kolbert (The Sixth Extinction)

Slide 10

Slide 10 text

Why the buzz?

Slide 11

Slide 11 text

Google Trends: "Deep Learning"

Slide 12

Slide 12 text


Slide 13

Slide 13 text

Crude timeline of neural networks: 1950s Perceptron, 1980s Backprop & application, 1990–2000 NN Winter

Slide 14

Slide 14 text

2010 Stacking RBMs Deep Learning fuss

Slide 15

Slide 15 text

HUGE DATA: Large Synoptic Survey Telescope (2022), 30 TB/night

Slide 16

Slide 16 text

HUGE CAPABILITIES: GPGPU (~20x speedup), powerful clusters

Slide 17

Slide 17 text

HUGE SUCCESS: speech and text understanding, robotics / computer vision, business / big data, artificial general intelligence (AGI)

Slide 18

Slide 18 text

How it's done?

Slide 19

Slide 19 text

Shallow network: h = f(x, W0), ŷ = f'(h, W1); minimize L(y, ŷ)
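The shallow network on this slide can be sketched in a few lines of numpy. The concrete choices here (f = tanh, f' = identity, squared-error loss, layer sizes) are illustrative assumptions, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
W0 = rng.normal(size=(4, 3))   # input (3) -> hidden (4)
W1 = rng.normal(size=(1, 4))   # hidden (4) -> output (1)

def forward(x):
    h = np.tanh(W0 @ x)        # hidden representation h = f(x, W0)
    return W1 @ h              # prediction y_hat = f'(h, W1)

def loss(y, y_hat):
    return float(np.sum((y - y_hat) ** 2))  # squared-error L(y, y_hat)

x = np.array([1.0, 0.5, -0.5])
print(loss(np.array([1.0]), forward(x)))
```

Training would adjust W0 and W1 to minimize this loss over a dataset, typically by gradient descent.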

Slide 20

Slide 20 text

Deep Network

Slide 21

Slide 21 text

Deep Network: more abstract features and stellar performance, but also vanishing gradients and overfitting.

Slide 22

Slide 22 text

Autoencoder: unsupervised feature learning (encode input x to a code h, decode back to a reconstruction)
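A minimal autoencoder sketch: encode x into a smaller code h = tanh(W_enc x), decode to x_hat = W_dec h, and train by minimizing the reconstruction error. The layer sizes, learning rate, and plain-SGD loop are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 6, 2
W_enc = rng.normal(scale=0.1, size=(n_hidden, n_in))
W_dec = rng.normal(scale=0.1, size=(n_in, n_hidden))

def mean_error(X):
    # average squared reconstruction error over the dataset
    return float(np.mean([np.sum((W_dec @ np.tanh(W_enc @ x) - x) ** 2)
                          for x in X]))

X = rng.normal(size=(100, n_in))
before = mean_error(X)

lr = 0.02
for _ in range(50):                        # plain SGD on reconstruction loss
    for x in X:
        h = np.tanh(W_enc @ x)             # code: the learned feature
        err = W_dec @ h - x                # reconstruction error
        grad_h = W_dec.T @ err * (1 - h ** 2)
        W_dec -= lr * np.outer(err, h)
        W_enc -= lr * np.outer(grad_h, x)

after = mean_error(X)                      # reconstruction error drops
```

No labels are used anywhere: the code h is learned purely from the structure of the inputs, which is what makes it usable as a pretraining step.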

Slide 23

Slide 23 text

Stacked Autoencoder. Y. Bengio et al.: Greedy Layer-Wise Training of Deep Networks

Slide 24

Slide 24 text

Stacked Autoencoder: 1. unsupervised, layer-by-layer pretraining; 2. supervised fine-tuning

Slide 25

Slide 25 text

Deep Belief Network: the 2006 breakthrough, stacking Restricted Boltzmann Machines (RBMs). Hinton, G. E., Osindero, S. and Teh, Y.: A fast learning algorithm for deep belief nets

Slide 26

Slide 26 text

Rethinking Computer Vision

Slide 27

Slide 27 text

Traditional image classification pipeline: Feature Extraction (SIFT, SURF etc.) → Classifier (SVM, NN etc.)

Slide 28

Slide 28 text

Convolutional Neural Network (images taken from deeplearning.net)

Slide 29

Slide 29 text

Convolutional Neural Network

Slide 30

Slide 30 text

Convolutional Neural Network (images taken from deeplearning.net)

Slide 31

Slide 31 text

Convolutional Neural Network
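The core CNN operation can be sketched directly: a small filter slides over the image, and each output value is the dot product of the filter with the patch under it, so the same weights are reused at every position. The edge-detecting filter values below are an illustrative assumption.

```python
import numpy as np

def conv2d(image, kernel):
    # valid (no-padding) 2D cross-correlation, the CNN building block
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.zeros((5, 5))
image[:, 2:] = 1.0                    # left half dark, right half bright
kernel = np.array([[-1.0, 1.0]])      # responds to horizontal intensity steps
response = conv2d(image, kernel)
print(response)                       # nonzero only at the vertical edge
```

In a real CNN the filter values are learned, many filters run in parallel, and pooling layers shrink the response maps between convolutions.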

Slide 32

Slide 32 text

The Starry Night, Vincent van Gogh. Leon A. Gatys, Alexander S. Ecker and Matthias Bethge: A Neural Algorithm of Artistic Style

Slide 33

Slide 33 text


Slide 34

Slide 34 text

Scene Description: CNN + RNN. Oriol Vinyals et al.: Show and Tell: A Neural Image Caption Generator

Slide 35

Slide 35 text

Learning Sequences

Slide 36

Slide 36 text

Recurrent Neural Network (simple Elman version): h_t = f(x_t, h_{t-1}; W0, W1), y_t = f'(h_t; W2)
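The Elman recurrence on this slide can be sketched as a loop over time steps, with the hidden state carrying history forward. The choices f = tanh, f' = identity, and the layer sizes are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_x, n_h, n_y = 3, 4, 2
W0 = rng.normal(scale=0.1, size=(n_h, n_x))   # input -> hidden
W1 = rng.normal(scale=0.1, size=(n_h, n_h))   # hidden -> hidden (recurrence)
W2 = rng.normal(scale=0.1, size=(n_y, n_h))   # hidden -> output

def run(sequence):
    h = np.zeros(n_h)                          # initial hidden state
    outputs = []
    for x_t in sequence:
        h = np.tanh(W0 @ x_t + W1 @ h)         # h_t = f(x_t, h_{t-1}; W0, W1)
        outputs.append(W2 @ h)                 # y_t = f'(h_t; W2)
    return outputs

ys = run(rng.normal(size=(6, n_x)))            # one output per time step
```

Because h_t feeds into h_{t+1}, the output at each step depends on the whole prefix of the sequence, not just the current input.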

Slide 37

Slide 37 text

Long Short-Term Memory (LSTM): add memory cells, learn the access mechanism. Sepp Hochreiter and Jürgen Schmidhuber: Long short-term memory
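One LSTM step shows both ideas on the slide: a memory cell c_t, plus learned gates (input, forget, output) that control access to it. The sizes and random weights are illustrative assumptions; biases are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
n_x, n_h = 3, 4

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# one weight matrix per gate, each acting on [x_t, h_{t-1}]
W_i, W_f, W_o, W_c = (rng.normal(scale=0.1, size=(n_h, n_x + n_h))
                      for _ in range(4))

def lstm_step(x_t, h_prev, c_prev):
    z = np.concatenate([x_t, h_prev])
    i = sigmoid(W_i @ z)           # input gate: how much new content to write
    f = sigmoid(W_f @ z)           # forget gate: how much old memory to keep
    o = sigmoid(W_o @ z)           # output gate: how much memory to expose
    c_tilde = np.tanh(W_c @ z)     # candidate cell content
    c_t = f * c_prev + i * c_tilde # gated memory update
    h_t = o * np.tanh(c_t)         # gated read-out
    return h_t, c_t

h, c = np.zeros(n_h), np.zeros(n_h)
for x in rng.normal(size=(5, n_x)):    # run a short sequence
    h, c = lstm_step(x, h, c)
```

The additive cell update c_t = f·c_prev + i·c_tilde is what lets gradients flow over long spans, easing the vanishing-gradient problem of plain RNNs.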

Slide 38

Slide 38 text

No content

Slide 39

Slide 39 text


Slide 40

Slide 40 text

Fooling Deep Networks. Anh Nguyen, Jason Yosinski, Jeff Clune: Deep Neural Networks are Easily Fooled

Slide 41

Slide 41 text

Next Cool things to try

Slide 42

Slide 42 text

Hyperparameter optimization: Bayesian. Optimization methods: adadelta, rmsprop . . . Regularization: dropout, dither . . .
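Of the optimizers named here, rmsprop has a particularly compact update rule: scale each step by a running average of squared gradients. A sketch on a toy one-dimensional problem, with typical (assumed) hyperparameter values:

```python
import numpy as np

lr, decay, eps = 0.01, 0.9, 1e-8
w = np.array([5.0])                # parameter, starting far from the optimum
avg_sq = np.zeros_like(w)          # running mean of squared gradients

for _ in range(1000):              # minimize f(w) = w**2, gradient 2w
    g = 2.0 * w
    avg_sq = decay * avg_sq + (1 - decay) * g ** 2
    w -= lr * g / (np.sqrt(avg_sq) + eps)

print(w)                           # close to the minimum at 0
```

Because the step is divided by the gradient's recent magnitude, progress stays roughly constant whether gradients are large or tiny, which helps in the ravines and plateaus common in deep-network loss surfaces.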

Slide 43

Slide 43 text

Attention & Memory: NTMs, Memory Networks, Stack RNNs . . . NLP: translation, description

Slide 44

Slide 44 text

Cognitive Hardware: FPGA, GPU, neuromorphic chips. Scalable DL: map-reduce, compute clusters

Slide 45

Slide 45 text

Deep Reinforcement Learning: DeepMind-ish things, deep Q-learning. Energy models: RBMs, DBNs . . .

Slide 46

Slide 46 text

https://www.reddit.com/r/MachineLearning/wiki

Slide 47

Slide 47 text

Theano (Python) | Torch (Lua) | Caffe (C++). GitHub is a friend.

Slide 48

Slide 48 text

Questions? @AbhinavTushar