Slide 1

SER594 Human-Computer Interaction
Lecture 07: Neural Net
Javier Gonzalez-Sanchez, PhD
[email protected]
javiergs.engineering.asu.edu
Office Hours: By appointment

Slide 2

Integration (diagram): individual agents integrated into a multimodal agent system

Slide 3

Continuous Dimensional Model

Slide 4

Continuous Dimensional Model (diagram: the octants of the three-dimensional space, each labeled with a +/− sign combination per axis, e.g., −++, −−−, +−−)

Slide 5

Continuous Dimensional Model

Slide 6

What about?
• Skin Conductance
• Pupil Size
• Heart Rate

Slide 7

Pattern Recognition

Slide 8

Tools

Slide 9

Neural Network | Neuron (diagram: input → output)

Slide 10

Neural Network | Neuron (diagram): inputs X1, X2, X3, X4

Slide 11

Neural Network | Neuron: inputs X1, X2, X3, X4. The neuron does two things:
• Add the inputs
• Normalize the sum

Slide 12

Neural Network | Neuron: with X1=1, X2=0.5, X3=0.5, X4=−1:
• Add: 1 + 0.5 + 0.5 − 1 = 1
• Normalize: the output is 0.73
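Where the 0.73 comes from: later slides name the normalizer sig(), the logistic sigmoid σ(z) = 1/(1 + e^(−z)), and σ(1) ≈ 0.731, which matches the output above. A quick check in Java:

public class SigmoidCheck {
    public static void main(String[] args) {
        double z = 1 + 0.5 + 0.5 - 1;               // Add: z = 1
        double out = 1.0 / (1.0 + Math.exp(-z));    // Normalize: logistic sigmoid
        System.out.println(out);                    // prints 0.7310585786300049
    }
}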

Slide 13

Neural Network | Weights (worked example): the same inputs X1=1, X2=0.5, X3=0.5, X4=−1 now enter through weights. With W1=0.5, W2=1, W3=1, W4=2 the weighted sum is 0.5·1 + 1·0.5 + 1·0.5 + 2·(−1) = −0.5, and the output is σ(−0.5) ≈ 0.37. With W1=0.5, W2=1, W3=−10 the weighted sum falls to −3.7 and the output is close to 0. Same inputs, different weights, very different outputs.

Slide 14

Neural Network | Weights

Slide 15

Neural Network: weights are what matter. So, how do we train the model?

Slide 16

Neural Network (diagram): one neuron with inputs X1, X2, X3, X4 and weights W1, W2, W3, W4; output = sig(W1*X1 + W2*X2 + W3*X3 + W4*X4)
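A minimal Java sketch of this single neuron (class and method names are illustrative, not from the slides):

public class Neuron {
    private final double[] weights;

    public Neuron(double[] weights) {
        this.weights = weights;
    }

    // Logistic sigmoid, the sig() normalizer from the slides.
    private static double sig(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
    }

    // output = sig(W1*X1 + W2*X2 + ... + Wn*Xn)
    public double fire(double[] x) {
        double sum = 0.0;
        for (int i = 0; i < weights.length; i++) {
            sum += weights[i] * x[i];    // Wi * Xi
        }
        return sig(sum);
    }
}

With the values from the weights slide, new Neuron(new double[]{0.5, 1, 1, 2}).fire(new double[]{1, 0.5, 0.5, -1}) returns about 0.377, the 0.37 shown there.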

Slide 17

Neural Network (diagram): a layer of two neurons sharing inputs X1..X4.
Neuron 1 (weights W1,1, W2,1, W3,1, W4,1): sig(W11*X1 + W21*X2 + W31*X3 + W41*X4)
Neuron 2 (weights W1,2, W2,2, W3,2, W4,2): sig(W12*X1 + W22*X2 + W32*X3 + W42*X4)
Indexing: W[i][j] = W[weight][neuron]

Slide 18

Neural Network (diagram): two layers, layer_1 and layer_2.
Layer 1: sig(W111*X1 + W121*X2 + W131*X3 + W141*X4) and sig(W112*X1 + W122*X2 + W132*X3 + W142*X4), with weights W1,1,2, W1,2,2, W1,3,2, W1,4,2 labeled for the second neuron.
Layer 2: each neuron computes sig(·) over the layer-1 outputs, with weights W2,1,1, W2,2,1, W2,1,2, W2,2,2.
Indexing: W[k][i][j] = W[layer][weight][neuron]
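A minimal feed-forward sketch in Java using this W[layer][weight][neuron] indexing (a sketch with illustrative names; the slides show only the diagram):

public class FeedForward {
    static double sig(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
    }

    // w[k][i][j]: weight i into neuron j of layer k.
    static double[] forward(double[][][] w, double[] input) {
        double[] activ = input;
        for (int k = 0; k < w.length; k++) {              // each layer
            int neurons = w[k][0].length;
            double[] next = new double[neurons];
            for (int j = 0; j < neurons; j++) {           // each neuron j in layer k
                double sum = 0.0;
                for (int i = 0; i < activ.length; i++) {  // each incoming weight i
                    sum += w[k][i][j] * activ[i];         // W[k][i][j] * X_i
                }
                next[j] = sig(sum);                       // layer-k output
            }
            activ = next;                                 // feed into the next layer
        }
        return activ;
    }
}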

Slide 19

Neural Network | Weights (diagram: input → output)

Slide 20

Cost Function
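The slide introduces the cost function by name only. A common concrete choice (an assumption here, not stated on the slide) is the mean squared error between expected and predicted outputs:

public class Cost {
    // Mean squared error: average of (expected - predicted)^2 over all
    // output neurons of all training samples.
    static double cost(double[][] expected, double[][] predicted) {
        double total = 0.0;
        int n = 0;
        for (int s = 0; s < expected.length; s++) {           // each training sample
            for (int j = 0; j < expected[s].length; j++) {    // each output neuron
                double diff = expected[s][j] - predicted[s][j];
                total += diff * diff;
                n++;
            }
        }
        return total / n;
    }
}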

Slide 21

Backpropagation
• Computes the partial derivative of the cost function with respect to any weight in the network.
• We want to see how the cost changes as we let just one weight change while holding all the others constant.
• In practice it is often treated as a black box (see the sketch below).
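To make the "one weight at a time" idea concrete, here is a finite-difference sketch in Java that approximates ∂C/∂w numerically and takes one gradient-descent step. This is not the backpropagation algorithm itself, which obtains the same derivatives analytically and far more cheaply; the cost function is passed in as a hypothetical callback.

import java.util.function.ToDoubleFunction;

public class GradientStep {
    // Nudge the single weight w[k][i][j] up and down, estimate dC/dw by
    // central difference, then move the weight downhill on the cost surface.
    static void updateWeight(double[][][] w, int k, int i, int j,
                             double learningRate,
                             ToDoubleFunction<double[][][]> cost) {
        double eps = 1e-6;
        double original = w[k][i][j];
        w[k][i][j] = original + eps;
        double cPlus = cost.applyAsDouble(w);             // cost, weight nudged up
        w[k][i][j] = original - eps;
        double cMinus = cost.applyAsDouble(w);            // cost, weight nudged down
        double gradient = (cPlus - cMinus) / (2 * eps);   // ~ dC/dw[k][i][j]
        w[k][i][j] = original - learningRate * gradient;  // gradient-descent step
    }
}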

Slide 22

Example

Slide 23

Handwriting
• 60,000 training examples
• 10,000 test examples
• Black-and-white digit images of 28×28 pixels
• Each pixel contains a number from 0 to 255 giving the gray level: 0 is white and 255 is black.
http://ramok.tech/tag/java-neural-networks-example/

Slide 24

Handwriting
• Input (X): an array of 784 numbers in the range 0-255, one per pixel; there are 60,000 of these
• Output: an array of 10 elements (one probability per digit)
• More layers can improve accuracy, but training will also be much slower. (A flattening sketch follows below.)
http://ramok.tech/tag/java-neural-networks-example/
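How a 28×28 image becomes the 784-element input array, as a minimal Java sketch (scaling the 0-255 gray values to [0, 1] is a common preprocessing step assumed here, not shown on the slides):

public class ImageToVector {
    // pixels is a 28x28 grid of gray values 0-255 (0 = white, 255 = black).
    static double[] toInputVector(int[][] pixels) {
        double[] x = new double[28 * 28];                  // the 784 inputs
        for (int row = 0; row < 28; row++) {
            for (int col = 0; col < 28; col++) {
                x[row * 28 + col] = pixels[row][col] / 255.0;  // scale to [0, 1]
            }
        }
        return x;
    }
}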

Slide 25

Handwriting

public void train(Integer trainData, Integer testFieldValue) {
    initSparkSession();
    // Load the MNIST training and test images from the IDX files.
    List<LabeledImage> labeledImages = IdxReader.loadData(trainData);
    List<LabeledImage> testLabeledImages = IdxReader.loadTestData(testFieldValue);
    Dataset<Row> train = sparkSession.createDataFrame(labeledImages, LabeledImage.class).checkpoint();
    Dataset<Row> test = sparkSession.createDataFrame(testLabeledImages, LabeledImage.class).checkpoint();
    // in = 28x28 = 784, hidden layers (128, 64), out = 10
    int[] layers = new int[]{784, 128, 64, 10};
    MultilayerPerceptronClassifier trainer = new MultilayerPerceptronClassifier()
            .setLayers(layers)      // network shape
            .setBlockSize(128)      // stacks inputs into blocks to speed up computation
            .setSeed(1234L)         // reproducible weight initialization
            .setMaxIter(100);       // training iterations
    model = trainer.fit(train);
    // evalOnTest(test);
}
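The evalOnTest(test) call is commented out on the slide. A minimal sketch of what it might contain, using Spark MLlib's MulticlassClassificationEvaluator (the method body is an assumption; only the call appears on the slide, and it would live in the same class as train(), reusing its model field):

import org.apache.spark.ml.evaluation.MulticlassClassificationEvaluator;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

private void evalOnTest(Dataset<Row> test) {
    // Run the trained model on the held-out test set; this adds a
    // "prediction" column alongside the original "label" column.
    Dataset<Row> predictions = model.transform(test);
    MulticlassClassificationEvaluator evaluator = new MulticlassClassificationEvaluator()
            .setMetricName("accuracy");   // fraction of digits classified correctly
    System.out.println("Test accuracy = " + evaluator.evaluate(predictions));
}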

Slide 26

Homework http://ramok.tech/tag/java-neural-networks-example/

Slide 27

Homework: recognize states

Slide 28

SER594 – Human Computer Interaction
Javier Gonzalez-Sanchez
[email protected]
Spring 2019
Disclaimer. These slides can only be used as study material for the SER594 course at ASU. They cannot be distributed or used for another purpose.