Backpropagation • Computes the partial derivative of the cost function with respect to any weight in the network. • We want to see how the cost changes as we let just one weight change while holding all the others constant. • Without it, the trained network stays a black box.
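A minimal sketch of the idea above, in Java to match the course's example code (this class and its values are hypothetical, not from the slides): for a single sigmoid neuron with squared-error cost, the chain-rule derivative with respect to one weight is checked against a finite-difference estimate that nudges only that weight while holding everything else constant.

```java
// Hypothetical sketch: partial derivative of the cost with respect to ONE
// weight of a single sigmoid neuron, compared against a finite-difference
// estimate that varies only that weight (all other variables held constant).
public class BackpropSketch {
    static double sigmoid(double z) { return 1.0 / (1.0 + Math.exp(-z)); }

    // Squared-error cost for one training pair (x, target), weights w, bias b
    static double cost(double[] w, double b, double[] x, double target) {
        double z = b;
        for (int i = 0; i < w.length; i++) z += w[i] * x[i];
        double a = sigmoid(z);
        return 0.5 * (a - target) * (a - target);
    }

    public static void main(String[] args) {
        double[] w = {0.3, -0.2};
        double b = 0.1;
        double[] x = {0.5, 0.8};
        double target = 1.0;

        // Analytic partial derivative via the chain rule:
        // dC/dw0 = (a - target) * a * (1 - a) * x0
        double z = b + w[0] * x[0] + w[1] * x[1];
        double a = sigmoid(z);
        double analytic = (a - target) * a * (1 - a) * x[0];

        // Finite difference: nudge only w0, keep w1, b, and x fixed
        double eps = 1e-6;
        double[] wPlus  = {w[0] + eps, w[1]};
        double[] wMinus = {w[0] - eps, w[1]};
        double numeric =
            (cost(wPlus, b, x, target) - cost(wMinus, b, x, target)) / (2 * eps);

        // The two values should agree to several decimal places
        System.out.printf("analytic=%.8f numeric=%.8f%n", analytic, numeric);
    }
}
```

Backpropagation generalizes this chain-rule step layer by layer, so no finite differences are needed at training time.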
Handwriting • 60,000 training images • 10,000 test images • Black-and-white digit images of 28×28 pixels • Each pixel contains a number from 0-255 giving the gray scale: 0 white and 255 black. http://ramok.tech/tag/java-neural-networks-example/
Handwriting • Input (X) – an array of 784 numbers, each 0-255 • 60,000 of these • Output – an array of 10 elements (probabilities, one per digit) • More layers can give better results, but training will also be much slower. http://ramok.tech/tag/java-neural-networks-example/
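A small Java sketch of the input/output shapes described above (the class and helper names are illustrative, not taken from the ramok.tech example): a 28×28 grid of 0-255 gray values is flattened into the 784-element input vector, and the predicted digit is read off the 10-element probability output.

```java
// Hypothetical sketch: preparing the 784-element input vector from a 28x28
// grayscale image (0 = white, 255 = black, as in the slides) and reading the
// predicted digit from the network's 10-element probability output.
public class MnistInputSketch {
    // Flatten row-major and scale 0-255 gray values to [0, 1]
    static double[] flatten(int[][] pixels) {
        double[] x = new double[28 * 28];   // 784 inputs
        for (int r = 0; r < 28; r++)
            for (int c = 0; c < 28; c++)
                x[r * 28 + c] = pixels[r][c] / 255.0;
        return x;
    }

    // The predicted digit is the index of the largest of the 10 probabilities
    static int argmax(double[] probs) {
        int best = 0;
        for (int i = 1; i < probs.length; i++)
            if (probs[i] > probs[best]) best = i;
        return best;
    }

    public static void main(String[] args) {
        int[][] img = new int[28][28];
        img[0][0] = 255;                    // one fully black pixel
        double[] x = flatten(img);
        System.out.println(x.length);       // 784

        double[] out = {0.01, 0.90, 0.02, 0.0, 0.0, 0.0, 0.0, 0.05, 0.01, 0.01};
        System.out.println(argmax(out));    // predicted digit: 1
    }
}
```

Scaling pixels to [0, 1] is a common preprocessing choice that keeps the sigmoid inputs in a well-behaved range; the argmax over the 10 outputs is how the probability array is turned into a single digit prediction.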
SER594 – Human Computer Interaction Javier Gonzalez-Sanchez [email protected] Spring 2019 Disclaimer. These slides can only be used as study material for the SER594 course at ASU. They cannot be distributed or used for another purpose.