Deep Learning with PyTorch

Introduction to Machine and Deep Learning basics, and overview of main PyTorch features.

This deck is the introductory overview for the Deep Learning with PyTorch course @ FBK

Valerio Maggio

April 22, 2020

Transcript

  1. Deep Learning and Artificial Intelligence with PyTorch Valerio Maggio, PhD

    @leriomaggio valerio.maggio@bristol.ac.uk
  2. Me? Who? Background in CS; PhD in Machine Learning

    for Software Engineering; Research: ML/DL for BioMedicine; wearing face masks since…
  3. The AI Revolution

  4. Artificial Intelligence according to Google

  5. Machine Learning [diagram: Machine Learning at the intersection of

    Artificial Intelligence and Data Science, a.k.a. "Deep Pattern Matching" ⛑]
  6. ML and Data Science [Data Science Venn diagram by Drew

    Conway; a (toy) Data Science Pipeline: Data Loading → Preprocessing → Model Learning → API Interface (Model). Adapted from "What about tests in Machine Learning projects?", Sarah Diot-Girard, EuroSciPy 2019]
  7. What is Machine Learning?

  8. Machine Learning "Machine learning is the science (and art) of

    programming computers so they can learn from data" (Aurélien Géron, Hands-on Machine Learning with Scikit-Learn and TensorFlow; source: bit.ly/ml-simple-definition). "(ML) focuses on teaching computers how to learn without the need to be programmed for specific tasks" (S. Pal & A. Gulli, Deep Learning with Keras)
  9. Machine Learning "Machine learning teaches machines how to carry out

    tasks by themselves. It is that simple. The complexity comes with the details." Luis Pedro Coelho, Building Machine Learning Systems with Python (and that's probably one of the reasons why you're here :)
  10. (Machine) Learning is about DATA Data are one of

    the most important parts of an ML solution. Importance: Data >> Model? Learning by examples. Data preparation is crucial! [diagram: data vs. algorithms]
  11. BioMedicine: another data case? Contemporary Life Science is about

    data: recent advances in sequencing technologies and instruments (e.g. "bio-images"); huge datasets generated at an incredible pace; from human observation to data analysis; cheminformatics (drug discovery). Research Impact → Social and Human Impact
  12. Why Deep Learning, btw? A subset of ML with a very specific

    model: (Deep?) Neural Networks. State of the art today, though the theory dates to the '50s/'80s; what is (relatively) new is hardware acceleration for training, plus learning structure + composability (2018/19)
  13. What about Deep Learning?

  14. Deep Learning A multi-layer feed-forward neural network that starts with

    a fully connected input layer, followed by multiple hidden layers of non-linear transformations
  15. More details…

  16. More details… ReLU | sigmoid | tanh

  17. More details… Repeat for each layer…

  18. More details… Image Classification Task

  19. More details… Summary: A Neural Network is built from

    layers, each of which is: a matrix multiplication, then adding a bias, then applying a non-linearity. Learn values for the parameters W and b of each layer using Back-Propagation (see the sketch below)
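
    A minimal sketch of that summary in PyTorch (shapes and variable names here are illustrative, not from the slides): one layer is a matrix multiplication, plus a bias, followed by a non-linearity.

        import torch

        x = torch.randn(32, 100)   # a batch of 32 inputs, 100 features each
        W = torch.randn(100, 50)   # learnable weight matrix of the layer
        b = torch.randn(50)        # learnable bias vector
        h = torch.relu(x @ W + b)  # matmul, add bias, apply non-linearity -> (32, 50)
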
  20. Machine Learning for dummies (a.k.a. ML explained to computer scientists)

    Note: I *am* a computer scientist. Machine Learning = Matrix Multiplication + Random Number Generation; for Deep Learning, repeat t times, t ≅ 2k
  21. ML/DL basics in a nutshell

  22. Supervised learning [diagram: features + labels extracted from (raw)

    data, with supervision → ML/DL Model Training → Trained Model; (unseen) data → Test → Predictions]
  23. Unsupervised learning [diagram: features extracted from (raw) data,

    without labels → ML/DL Model Training → Trained Model; (unseen) data → Test → Similarities/likelihood]
  24. Deep Supervised learning [diagram: (raw) data + labels, with

    supervision → DL Model Training (features are learned by the model itself) → Trained Model; (unseen) data → Test → Predictions]
  25. Supervised Training Loop [diagram: (raw) data + labels → Model

    (Parameters) → predictions → Loss; the loss is fed back to update the Parameters]
  26. Supervised Training Loop breakdown…

    (raw) Data, a.k.a. Observations / Input: items about which we want to predict something; we usually denote an observation with x.
    Labels, a.k.a. Targets (i.e. Ground Truth): labels corresponding to observations, usually the things being predicted; following standard ML/DL notation, we use y to refer to these.
    Model f(x) = ŷ: a mathematical expression or function that takes an observation x and predicts the value of its target label.
    Predictions, a.k.a. Estimates: values of the targets generated by the model, usually referred to as ŷ.
    Parameters, a.k.a. Weights (in DL terminology): the parameters of the model, referred to using w.
    Loss Function L(y, ŷ): a function that compares how far off a prediction is from its target for observations in the training data, assigning a scalar real value called the loss; the lower the loss, the better the model is predicting.
    Source: D. Rao et al., Natural Language Processing with PyTorch, O'Reilly 2019 (a minimal loop sketch follows below)
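
    To make the breakdown concrete, here is a minimal supervised training loop sketched in PyTorch; the model, loss, data, and learning rate are hypothetical placeholders.

        import torch

        model = torch.nn.Linear(10, 1)                 # the model f(x) = y_hat
        loss_fn = torch.nn.MSELoss()                   # the loss function L(y, y_hat)
        optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

        x = torch.randn(64, 10)                        # observations
        y = torch.randn(64, 1)                         # targets (ground truth)

        for epoch in range(10):
            y_hat = model(x)                           # predictions (estimates)
            loss = loss_fn(y_hat, y)                   # scalar loss value
            optimizer.zero_grad()                      # reset accumulated gradients
            loss.backward()                            # back-propagation
            optimizer.step()                           # update the parameters w
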
  27. (DL) Terms: everyone on the same page? (also ref: bit.ly/nvidia-dl-glossary)

    Epochs; Batches and mini-batch learning; Parameters vs. Hyperparameters (e.g. weights vs. layers); Loss & Optimiser (e.g. Cross Entropy & SGD); Transfer learning; Gradient & Backward Propagation; Tensor
  28. Python has its say Machine Learning / Deep Learning "There should

    be one-- and preferably only one --obvious way to do it" The Zen of Python
  29. Multiple Frameworks?

  30. If someone tells you …

  31. Deep Learning Frameworks Static Graph vs. Dynamic Graph [diagram:

    the computational graph of a Linear (or Dense) model, σ(xᵀW + b), built from nodes X, W, b, *, +, σ; a network fc1…fc5 with loss L comparing y and y'; the dynamic graph is shown being built at epoch 1, batch 1]
  32. Deep Learning Frameworks Static Graph vs. Dynamic Graph [diagram,

    continued: the dynamic graph is rebuilt for every batch; shown at epoch 1, batch 2]
  33. Deep Learning Frameworks Static Graph vs. Dynamic Graph [diagram:

    backwards and gradients calculation; in the static case backprop is part of the compiled graph, while the dynamic case records operations as they run (Autograd Record)]
  34. Deep Learning Frameworks Static Graph vs. Dynamic Graph [diagram:

    backwards and gradients calculation with Autograd: record the forward pass, then replay it backwards (Record & Replay Backprop); a small sketch follows below]
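
    A small sketch of the record-and-replay idea (tensor shapes are illustrative): autograd records the operations of each forward pass and replays them in reverse when backward() is called.

        import torch

        x = torch.randn(4, 3)
        W = torch.randn(3, 2, requires_grad=True)
        b = torch.randn(2, requires_grad=True)

        out = torch.sigmoid(x @ W + b)  # sigma(xW + b): the graph is recorded here
        out.sum().backward()            # replay the recorded graph backwards

        print(W.grad.shape)             # gradients are now populated: (3, 2)
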
  35. Spoiler Alert: a rundown review of the basic PyTorch features

    we will see soon
  36. Tensors, NumPy, Devices NumPy-like API; tensor -> ndarray; tensor <-

    ndarray; CUDA support (sketched below)
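
    A quick sketch of the tensor/NumPy interop and device handling (the values are illustrative):

        import numpy as np
        import torch

        t = torch.ones(2, 3)              # NumPy-like API
        a = t.numpy()                     # tensor -> ndarray (shares memory on CPU)
        t2 = torch.from_numpy(np.eye(3))  # ndarray -> tensor (shares memory)

        device = "cuda" if torch.cuda.is_available() else "cpu"
        t = t.to(device)                  # move the tensor to the GPU when available
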
  37. Neural Module (nn.Module) subclassing Definition of layers (i.e. tensors);

    definition of the graph (i.e. the network); see the sketch below
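
    A minimal nn.Module subclass sketch (layer sizes here are hypothetical): layers, and hence their parameter tensors, are defined in __init__, while the graph, i.e. the network, is defined by the forward pass.

        import torch.nn as nn
        import torch.nn.functional as F

        class Net(nn.Module):
            def __init__(self):
                super().__init__()
                self.fc1 = nn.Linear(784, 128)  # layers (parameter tensors) defined here
                self.fc2 = nn.Linear(128, 10)

            def forward(self, x):
                x = F.relu(self.fc1(x))         # the graph (network) is the forward pass
                return self.fc2(x)
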
  38. Loss and Gradients optimiser; criterion & loss; backprop & update (sketched below)
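
    One optimisation step, sketched with the Net class from the previous example and the Cross Entropy / SGD pairing mentioned earlier (batch contents are made up):

        import torch

        model = Net()                             # the nn.Module sketched above
        criterion = torch.nn.CrossEntropyLoss()   # criterion & loss
        optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

        inputs = torch.randn(16, 784)
        targets = torch.randint(0, 10, (16,))

        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()                           # backprop
        optimizer.step()                          # update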

  39. Dataset and DataLoader transformers; Dataset; DataLoader (sketched below)
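
    A sketch of a custom Dataset with an optional per-sample transform, wrapped in a DataLoader for mini-batch learning (the dataset itself is hypothetical):

        import torch
        from torch.utils.data import Dataset, DataLoader

        class ToyDataset(Dataset):
            def __init__(self, data, labels, transform=None):
                self.data, self.labels, self.transform = data, labels, transform

            def __len__(self):
                return len(self.data)

            def __getitem__(self, idx):
                x = self.data[idx]
                if self.transform is not None:
                    x = self.transform(x)  # apply the transform per sample
                return x, self.labels[idx]

        dataset = ToyDataset(torch.randn(100, 784), torch.randint(0, 10, (100,)))
        loader = DataLoader(dataset, batch_size=16, shuffle=True)  # mini-batches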

  40. So….

  41. Deep Learning with PyTorch Repository: http://github.com/leriomaggio/deep-learning-pytorch

    mybinder / Colaboratory (more on this in a few minutes)
  42. Deep Learning Course Approach: Data Scientist: always prefer dev/practical aspects

    (tools & software); work on the full pipeline (e.g. data preparation); emphasis on the implementation. Perspective: Researcher: no off-the-shelf (so no "black-box") solutions; References and Further Readings to learn more
  43. Let’s roll up our sleeves & get on with the

    hands-on