OVER MT. STUPID TO DEEP LEARNING… Carsten Sandtner \\ @casarock Photo by Archie Binamira from Pexels https://www.pexels.com/photo/man-wearing-white-shirt-brown-shorts-and-green-backpack-standing-on-hill-672358/
ABOUT://ME My name is Carsten and I’m Technical Director at mediaman GmbH in Mayence (Mainz). I’m a moz://a Techspeaker and I love the open web and everything about open standards for the web! I’m tweeting as @casarock
“The artificial intelligence ecosystem — flooded with capital, hungry for commercial applications, and yet polluted with widespread, misplaced optimism and fear — will continue to swell.” – Amy Webb, SXSW 2018
HISTORY 1950: Neural Networks! ~1980: Machine Learning. Today: Deep Learning. Photo by Dick Thomas Johnson https://www.flickr.com/photos/[email protected]/14810867549/
SIMPLE EXAMPLE A machine with sensors x1, x2 and x3 and weights a, b and c: Y = Const + a·x1 + b·x2 + c·x3 Const: the value of Y when x1, x2 and x3 are 0. Training data: 100,000 datasets. Keep 25,000 for validation, train with the other 75,000 -> vary the weights until the result (Y) is ok. Then verify your model with the 25,000 validation sets.
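The slide's recipe can be sketched in a few lines of plain Python. This is a minimal illustration, not production code: the "sensor" data is synthetic, generated from hypothetical true weights so we can check the result, and the dataset is scaled down from 100,000 to 1,000 rows while keeping the 75/25 train/validation split.

```python
import random

random.seed(42)

# Hypothetical "true" relationship the machine's sensors follow,
# so we have something to generate training data from.
TRUE_CONST, TRUE_A, TRUE_B, TRUE_C = 1.0, 2.0, -1.5, 0.5

def make_dataset(n):
    data = []
    for _ in range(n):
        x1, x2, x3 = (random.uniform(-1, 1) for _ in range(3))
        y = TRUE_CONST + TRUE_A * x1 + TRUE_B * x2 + TRUE_C * x3
        data.append(((x1, x2, x3), y))
    return data

# 100,000 sets in the slide; scaled down here, same 75/25 split.
dataset = make_dataset(1000)
train, validation = dataset[:750], dataset[750:]

# Start with arbitrary weights and vary them until the result is ok
# (here via stochastic gradient descent on the squared error).
const, a, b, c = 0.0, 0.0, 0.0, 0.0
lr = 0.1
for epoch in range(200):
    for (x1, x2, x3), y in train:
        pred = const + a * x1 + b * x2 + c * x3
        err = pred - y
        const -= lr * err
        a -= lr * err * x1
        b -= lr * err * x2
        c -= lr * err * x3

# Verify the model with the held-out validation sets.
val_error = sum(abs((const + a * x1 + b * x2 + c * x3) - y)
                for (x1, x2, x3), y in validation) / len(validation)
print("mean validation error:", round(val_error, 6))
```

Because the synthetic data is noiseless and linear, the learned weights land very close to the true ones and the validation error shrinks toward zero.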
REINFORCEMENT LEARNING has no predefined result for the training data. It uses rewards for good results: if an action wasn't good, never do it again, bad boy! Example: learning how to play a game purely by analysing every pixel. Popular example: AlphaGo
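A toy sketch of the reward idea, far from AlphaGo: a tiny Q-learning agent on a hypothetical one-dimensional "game" with five states, where reaching the rightmost state pays a reward of +1. Good moves get their value raised; moves that don't lead to reward fade out and are avoided. All names and numbers here are illustrative assumptions.

```python
import random

random.seed(0)

# Tiny 1-D "game": states 0..4, reward +1 for reaching state 4.
# Actions: 0 = move left, 1 = move right.
N_STATES, GOAL = 5, 4
q = [[0.0, 0.0] for _ in range(N_STATES)]   # Q-value table
alpha, gamma, eps = 0.5, 0.9, 0.2            # step size, discount, exploration

for episode in range(500):
    state = 0
    while state != GOAL:
        # Mostly follow the best known action, sometimes explore.
        if random.random() < eps:
            action = random.randrange(2)
        else:
            action = q[state].index(max(q[state]))
        nxt = max(0, state - 1) if action == 0 else min(GOAL, state + 1)
        reward = 1.0 if nxt == GOAL else 0.0
        # Reward good moves; bad moves lose out as better values win.
        q[state][action] += alpha * (reward + gamma * max(q[nxt]) - q[state][action])
        state = nxt

# The greedy policy after training: which action each state prefers.
policy = [row.index(max(row)) for row in q[:GOAL]]
print("learned policy:", policy)
```

After training, every state prefers action 1 (move right), the only route to the reward, even though nobody ever told the agent a "correct" answer.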
BACKPROPAGATION For every neuron, from output to input: check the neuron's share in the error (using the weights), then fine-tune the weights in very small steps (e.g. 0.001). This step size is a hyperparameter: the learning rate! It applies to the whole network. Every weight adjusted? Start a new run. This is called a new epoch (the next hyperparameter!)
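The steps above can be sketched by hand for a tiny network: two inputs, two hidden sigmoid neurons, one output. This is an illustrative assumption, not the talk's code; XOR is used as toy training data, and both hyperparameters from the slide (learning rate and number of epochs) appear explicitly.

```python
import math
import random

random.seed(1)

LEARNING_RATE = 0.5   # the slide's "small step" hyperparameter
EPOCHS = 2000         # one epoch = one full pass over the training data

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy training data: XOR, a classic non-linear problem.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

# Randomly initialised weights: 2 hidden neurons, 1 output neuron.
w_h = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(2)]
b_h = [random.uniform(-1, 1), random.uniform(-1, 1)]
w_o = [random.uniform(-1, 1), random.uniform(-1, 1)]
b_o = random.uniform(-1, 1)

def forward(x1, x2):
    h = [sigmoid(w_h[i][0] * x1 + w_h[i][1] * x2 + b_h[i]) for i in range(2)]
    return h, sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + b_o)

def total_loss():
    return sum((forward(x1, x2)[1] - t) ** 2 for (x1, x2), t in data)

loss_before = total_loss()

for epoch in range(EPOCHS):
    for (x1, x2), target in data:
        h, out = forward(x1, x2)
        # Output neuron's share in the error.
        d_out = (out - target) * out * (1 - out)
        # Hidden neurons' share, distributed backwards via the weights.
        d_h = [d_out * w_o[i] * h[i] * (1 - h[i]) for i in range(2)]
        # Fine-tune every weight by a small step: the learning rate.
        for i in range(2):
            w_o[i] -= LEARNING_RATE * d_out * h[i]
            w_h[i][0] -= LEARNING_RATE * d_h[i] * x1
            w_h[i][1] -= LEARNING_RATE * d_h[i] * x2
            b_h[i] -= LEARNING_RATE * d_h[i]
        b_o -= LEARNING_RATE * d_out

loss_after = total_loss()
print("loss:", round(loss_before, 4), "->", round(loss_after, 4))
```

Each inner-loop body is one backpropagation step: error share from output to input, then a tiny weight adjustment; the outer loop counts the epochs.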
CONVERGENCE Always keep an eye on your error. Is it minimal? Stop learning! Validate your model with other data (not the training data!) Try to avoid overfitting!
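"Is the error minimal? Stop learning!" is usually implemented as early stopping: watch the validation error each epoch and stop once it stops shrinking, because a rising validation error while training error keeps falling is the classic sign of overfitting. A minimal sketch of that logic, using a hypothetical hand-written validation-error curve:

```python
# Hypothetical validation-error curve: improves, then overfitting sets in.
val_errors = [0.90, 0.55, 0.32, 0.21, 0.18, 0.17, 0.19, 0.24, 0.31]

best_error = float("inf")
best_epoch = None
patience, bad_epochs = 2, 0   # tolerate a couple of bad epochs before stopping

for epoch, err in enumerate(val_errors):
    if err < best_error:
        best_error, best_epoch = err, epoch
        bad_epochs = 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            # Validation error stopped shrinking: stop learning.
            break

print("best epoch:", best_epoch, "with validation error", best_error)
```

In a real training loop you would also keep a copy of the weights from the best epoch, since the final weights belong to the (already overfitting) last epoch.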
SOME BETTER EXAMPLES Categorizing products in an online shop using their images. Using cognitive services for Natural Language Processing (NLP), voice- or text-based. Using a cloud-based AI service!