Deep Learning • Neural Networks from the '90s, rebranded in 2006+ • "Neuron" is a loose inspiration (not important) • Stacked layers of differentiable modules (matrix multiplication, convolution, pooling, element-wise nonlinear operations…) • Trained via gradient descent on large datasets of input/output example pairs
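The "stacked differentiable modules + gradient descent" recipe above can be sketched end to end. This is a minimal illustration of my own (not from the slides): a two-layer network of matrix multiplies and element-wise nonlinearities, trained by plain gradient descent on input/output pairs (XOR as a toy dataset).

```python
import numpy as np

# Toy input/output pairs: the XOR function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 8))   # module 1: matrix multiply
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))   # module 2: matrix multiply
b2 = np.zeros(1)
lr = 0.5

for step in range(5000):
    # Forward pass through the stacked differentiable modules.
    h = np.tanh(X @ W1 + b1)                    # element-wise nonlinearity
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output

    # Backward pass: chain rule, module by module
    # (gradient of cross-entropy w.r.t. the sigmoid's input is p - y).
    d2 = (p - y) / len(X)
    dW2, db2 = h.T @ d2, d2.sum(0)
    d1 = (d2 @ W2.T) * (1 - h ** 2)             # tanh derivative
    dW1, db1 = X.T @ d1, d1.sum(0)

    # Gradient descent update.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(np.round(p.ravel(), 2))  # predictions should approach 0, 1, 1, 0
```

Swapping the dense layers for convolutions or pooling changes only the forward/backward rules of each module; the training loop stays the same.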
Recent successes • 2009: state-of-the-art acoustic models for speech recognition • 2011: state-of-the-art road sign classification • 2012: state-of-the-art object classification • 2013/14: end-to-end speech recognition, object detection • 2014/15: state-of-the-art machine translation; getting closer to state of the art for Natural Language Understanding in general
ImageNet Challenge ILSVRC2014 • 1.2 million images • 1,000 classes • Latest winner: GoogLeNet, now below a 5% error rate • Used in Google Photos for indexing
Neural Turing Machines • Google DeepMind, October 2014 • Neural network coupled to an external memory (tape) • Analogous to a Turing Machine, but differentiable • Can learn simple programs from example input/output pairs: • copy, repeat copy, associative recall • binary n-gram counts, and sort
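The key trick that makes the NTM differentiable can be shown in a few lines. This is my own minimal sketch (not DeepMind's code): instead of indexing one tape cell, the controller emits a key and reads a softmax-weighted blend of all memory rows (content-based addressing), so gradients flow through the memory access itself.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def content_read(memory, key, beta=5.0):
    """Differentiable read: cosine similarity -> soft address -> blended row."""
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8)
    w = softmax(beta * sims)    # sharp but smooth attention over tape cells
    return w @ memory, w        # read vector is a weighted mix of rows

# A tiny 3-cell "tape"; the key resembles row 0.
memory = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])
read, w = content_read(memory, np.array([0.9, 0.1, 0.0]))
# w concentrates on row 0, so `read` is close to [1, 0, 0].
```

A full NTM adds differentiable writes and location-based shifts on top of this addressing, but the principle is the same: every tape operation is a smooth function, so the whole machine trains by gradient descent.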
Conclusion • Deep Learning progress is fast-paced • Many applications are already in production (e.g. speech, image indexing, face recognition) • Machine Learning is now moving from pattern recognition to higher-level reasoning • Generic AI is no longer a swear word among machine learners