SER594 Lecture 07

Human Computer Interaction
Neural Net
(201903)

Javier Gonzalez-Sanchez

April 23, 2019


Transcript

  1. SER594
    Human-Computer Interaction
    Lecture 07
    Neural Net
    Javier Gonzalez-Sanchez, PhD
    [email protected]
    javiergs.engineering.asu.edu
    Office Hours: By appointment


  2. Integration
    [Diagram: multiple agents integrated into a multimodal system]


  3. Continuous Dimensional Model


  4. Continuous Dimensional Model
    [Diagram: regions of the dimensional space labeled with sign
    combinations: -++ --- --+ +-- ++- -+- --+]


  5. Continuous Dimensional Model


  6. What about?
    • Skin Conductance
    • Pupil Size
    • Heart Rate


  7. Pattern Recognition


  8. Tools


  9. Neural Network | Neuron
    input output


  10. Neural Network | Neuron
    X1
    X2
    X3
X4


  11. Neural Network | Neuron
    X1
    X2
    X3
X4
    • Add
    • Normalize


  12. Neural Network | Neuron
    X1=1
    X2=0.5
    X3=0.5
    X4=-1
    • Add
    • Normalize
    0.73
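
    A minimal Java sketch (not part of the deck), assuming "normalize"
    here means the sigmoid written as sig(...) on later slides; the class
    and method names are mine:

    public class Neuron {
      // Squash any real number into the range (0, 1).
      static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
      }
      // Add the inputs, then normalize the sum.
      static double activate(double[] x) {
        double sum = 0.0;
        for (double xi : x) sum += xi;
        return sigmoid(sum);
      }
      public static void main(String[] args) {
        double[] x = {1, 0.5, 0.5, -1};   // X1..X4 from the slide
        System.out.println(activate(x));  // ~0.73, as on the slide
      }
    }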


  13. Neural Network | Weights
    Same inputs (X1=1, X2=0.5, X3=0.5, X4=-1), different weights:
    W1=0.5, W2=1, W3=1, W4=2  ->  output 0.37
    Other weight sets shown (e.g., W1=0.5, W2=1, W3=-10) produce
    other values (0.18, -3.7): the weights determine the output.


  14. Neural Network | Weights


  15. Neural Network
    Weights are important.
    And, how do we train the model?


  16. Neural Network
    Inputs X1, X2, X3, X4 with weights W1, W2, W3, W4:
    output = sig (W1*X1 + W2*X2 + W3*X3 + W4*X4)
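
    A short Java sketch of this formula (mine, not the deck's code);
    with the inputs and the first weight set from slide 13 it reproduces
    the 0.37 output:

    public class WeightedNeuron {
      static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
      }
      // output = sig(W1*X1 + W2*X2 + W3*X3 + W4*X4)
      static double activate(double[] w, double[] x) {
        double z = 0.0;
        for (int i = 0; i < x.length; i++) {
          z += w[i] * x[i];
        }
        return sigmoid(z);
      }
      public static void main(String[] args) {
        double[] x = {1, 0.5, 0.5, -1};     // X1..X4 (slide 13)
        double[] w = {0.5, 1, 1, 2};        // W1..W4 (slide 13)
        System.out.println(activate(w, x)); // ~0.37
      }
    }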


  17. Neural Network
    Two neurons share the inputs X1..X4; each has its own weights.
    Indexing: W[i][j] = W[weight][neuron]
    neuron 1: sig (W11*X1 + W21*X2 + W31*X3 + W41*X4)
    neuron 2: sig (W12*X1 + W22*X2 + W32*X3 + W42*X4)


  18. Neural Network
    Two layers. Indexing: W[k][i][j] = W[layer][weight][neuron]
    layer_1 (from inputs X1..X4):
      h1 = sig (W111*X1 + W121*X2 + W131*X3 + W141*X4)
      h2 = sig (W112*X1 + W122*X2 + W132*X3 + W142*X4)
    layer_2 (from h1 and h2):
      o1 = sig (W211*h1 + W221*h2)
      o2 = sig (W212*h1 + W222*h2)
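
    A hedged Java sketch of the forward pass this slide implies, using
    the W[layer][weight][neuron] indexing; the weight values below are
    placeholders, not numbers from the deck:

    public class TwoLayerNet {
      static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
      }
      // Compute one layer: out[j] = sig(sum_i W[i][j] * in[i]).
      static double[] layer(double[][] w, double[] in) {
        int neurons = w[0].length;
        double[] out = new double[neurons];
        for (int j = 0; j < neurons; j++) {
          double z = 0.0;
          for (int i = 0; i < in.length; i++) {
            z += w[i][j] * in[i];
          }
          out[j] = sigmoid(z);
        }
        return out;
      }
      public static void main(String[] args) {
        double[] x = {1, 0.5, 0.5, -1};                  // X1..X4
        // Placeholder weights: layer 1 maps 4 inputs to 2 neurons,
        // layer 2 maps those 2 outputs to 2 neurons.
        double[][] w1 = {{0.5, 0.1}, {1, 0.2}, {1, 0.3}, {2, 0.4}};
        double[][] w2 = {{0.6, -0.6}, {0.7, -0.7}};
        double[] hidden = layer(w1, x);                  // layer_1
        double[] output = layer(w2, hidden);             // layer_2
        System.out.println(java.util.Arrays.toString(output));
      }
    }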


  19. Neural Network | Weights
    input output


  20. Cost Function


  21. Backpropagation
    • Computes the partial derivative of the cost function with
      respect to any weight in the network.
    • We want to see how the cost changes as we let just one of
      those weights change while holding all the others constant
      (a rough numerical sketch follows below).
    • Black-box
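
    A rough numerical illustration (my sketch, assuming a squared-error
    cost; backpropagation itself computes these derivatives analytically
    and far more efficiently): nudge a single weight, hold the others
    constant, and watch the cost change.

    public class PartialDerivative {
      static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
      }
      // Squared-error cost of one neuron on one example.
      static double cost(double[] w, double[] x, double target) {
        double z = 0.0;
        for (int i = 0; i < x.length; i++) {
          z += w[i] * x[i];
        }
        double out = sigmoid(z);
        return 0.5 * (out - target) * (out - target);
      }
      public static void main(String[] args) {
        double[] x = {1, 0.5, 0.5, -1};
        double[] w = {0.5, 1, 1, 2};
        double target = 1.0;
        double eps = 1e-6;
        // Nudge only W1; all the other weights stay fixed.
        double[] wPlus = w.clone();
        wPlus[0] += eps;
        double grad = (cost(wPlus, x, target) - cost(w, x, target)) / eps;
        System.out.println("dCost/dW1 ~ " + grad);
      }
    }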


  22. Example


  23. Handwriting
    • 60,000 training images
    • 10,000 test images
    • Black-and-white digit images of 28x28 pixels
    • Each pixel contains a number from 0-255 giving the gray
      scale: 0 white and 255 black.
    http://ramok.tech/tag/java-neural-networks-example/


  24. Handwriting
    • Input (X) – an array of 784 numbers, each 0-255
      (see the flattening sketch below)
    • 60,000 of these
    • Output – an array of 10 elements (probabilities)
    • More layers can improve accuracy, but training will also be
      much slower.
    http://ramok.tech/tag/java-neural-networks-example/
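
    A small sketch (not from the deck) of how a 28x28 image becomes the
    784-number input array, flattened row by row:

    // Flatten a 28x28 grid of gray-scale values (0-255) into one
    // 784-element input vector, row by row.
    static double[] flatten(int[][] pixels) {
      double[] x = new double[28 * 28];
      for (int row = 0; row < 28; row++) {
        for (int col = 0; col < 28; col++) {
          x[row * 28 + col] = pixels[row][col];
        }
      }
      return x;
    }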


  25. Handwriting
    public void train(Integer trainData, Integer testFieldValue) {
      initSparkSession();
      // Load the MNIST training and test images.
      List<LabeledImage> labeledImages = IdxReader.loadData(trainData);
      List<LabeledImage> testLabeledImages = IdxReader.loadTestData(testFieldValue);
      Dataset<Row> train =
        sparkSession.createDataFrame(labeledImages, LabeledImage.class).checkpoint();
      Dataset<Row> test =
        sparkSession.createDataFrame(testLabeledImages, LabeledImage.class).checkpoint();
      // in=28x28=784, hidden layers (128, 64), out=10
      int[] layers = new int[]{784, 128, 64, 10};
      MultilayerPerceptronClassifier trainer = new MultilayerPerceptronClassifier()
        .setLayers(layers)
        .setBlockSize(128)
        .setSeed(1234L)
        .setMaxIter(100);
      model = trainer.fit(train);
      // evalOnTest(test);
    }
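
    The deck leaves evalOnTest commented out. One plausible sketch of
    such a step, assuming Spark MLlib's MulticlassClassificationEvaluator
    (this body is my illustration, not the article's code):

    // import org.apache.spark.ml.evaluation.MulticlassClassificationEvaluator;
    private void evalOnTest(Dataset<Row> test) {
      // Run the trained model on the held-out images and score accuracy.
      Dataset<Row> predictions = model.transform(test);
      MulticlassClassificationEvaluator evaluator =
          new MulticlassClassificationEvaluator().setMetricName("accuracy");
      System.out.println("Test accuracy = " + evaluator.evaluate(predictions));
    }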


  26. Homework
    http://ramok.tech/tag/java-neural-networks-example/


  27. Homework
    Recognize states


  28. SER594 – Human Computer Interaction
    Javier Gonzalez-Sanchez
    [email protected]
    Spring 2019
    Disclaimer. These slides can only be used as study material for the SER594 course at ASU.
    They cannot be distributed or used for another purpose.
