Figure: rule-based systems vs. classic machine learning vs. representation learning vs. deep learning, showing how each maps inputs (hand-designed program / features, learned simple features, additional layers of more abstract features) to the output. Shaded boxes indicate components that are able to learn from data. (src: Deep Learning, Ian Goodfellow)
Common Activation Functions: the sigmoid squashes its input to a value between 0 and 1. The ReLU (Rectified Linear Unit) activation function often works a little better than a smooth function like the sigmoid, while also being significantly easier to compute.
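A quick sketch of these two functions in NumPy (the function names and sample values below are just for illustration):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input to a value between 0 and 1.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified Linear Unit: passes positive values through, zeroes out negatives.
    return np.maximum(0.0, x)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(z))  # every value falls in (0, 1)
print(relu(z))     # [0.  0.  0.  0.5 2. ]
```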
Training the model (*oversimplified):
• Take some image, do some preprocessing, and feed it to the model. The model has weights, biases, and activations.
• Check how bad the model is with a cost function:
  • It takes the weights & biases as inputs.
  • It measures the difference (cost) between the prediction and what it really is, and takes the average over the dataset, e.g. MSE:
    $C(w, b) = \frac{1}{n} \sum_{i=1}^{n} L(\hat{y}_i, y_i)$
• Calculate the gradient & update the weights and biases:
  • Use the chain rule to get the partial derivatives of the cost function with respect to $w$ and $b$.
  • Calculate the updated $w$ and $b$ with learning rate $\eta$:
    $w := w - \eta \cdot \frac{\partial C}{\partial w}, \qquad b := b - \eta \cdot \frac{\partial C}{\partial b}$
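A minimal sketch of this loop for a single linear neuron with an MSE cost; the toy data, the learning rate η = 0.1, and the step count are assumptions, not from the slides:

```python
import numpy as np

# Toy data: y ≈ 3x + 1 with a little noise, so there is something to fit.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 3.0 * x + 1.0 + 0.1 * rng.normal(size=100)

w, b = 0.0, 0.0   # weight and bias to be learned
eta = 0.1         # learning rate

for step in range(200):
    y_hat = w * x + b                      # prediction
    cost = np.mean((y_hat - y) ** 2)       # C(w, b) = (1/n) Σ (ŷᵢ - yᵢ)²
    dC_dw = np.mean(2 * (y_hat - y) * x)   # ∂C/∂w via the chain rule
    dC_db = np.mean(2 * (y_hat - y))       # ∂C/∂b
    w -= eta * dC_dw                       # w := w - η · ∂C/∂w
    b -= eta * dC_db                       # b := b - η · ∂C/∂b

print(w, b)  # should approach 3.0 and 1.0
```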
$a = \sigma(w_0 a_0 + \cdots + w_{n-1} a_{n-1} + b)$

To push an output activation higher (e.g. "0.67 is a Cat", fed by neurons detecting a cat's mouth, a dog's mouth, a dog's ear), we can:
• Increase the bias.
• Increase the weights, in proportion to the activations: "Neurons that fire together, wire together."
• Change the activations of the previous layer, in proportion to the weights, but we cannot change those activations directly.
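The same weighted-sum formula for one neuron, written out in NumPy; the weights, activations, and bias below are made-up values chosen so the output lands near 0.67:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

a_prev = np.array([0.9, 0.1, 0.8])  # activations from the previous layer
w = np.array([1.0, -0.4, 0.6])      # one weight per incoming activation
b = -0.64                           # bias

# a = σ(w₀a₀ + ... + w_{n-1}a_{n-1} + b)
a = sigmoid(np.dot(w, a_prev) + b)
print(a)  # ≈ 0.67, i.e. "0.67 is a Cat"
```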
Training Dataset: the sample of data used to fit the model. Validation Dataset: a sample of data that has never been seen by the model and is used to validate the model while it is still training.
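One common way to carve out these two sets, sketched with scikit-learn's train_test_split; the dummy data and the 80/20 split ratio are assumptions:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Dummy dataset: 1000 samples, 10 features, binary labels.
X = np.random.rand(1000, 10)
y = np.random.randint(0, 2, size=1000)

# Hold out 20% of the data as a validation set the model never sees during fitting.
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)

print(X_train.shape, X_val.shape)  # (800, 10) (200, 10)
```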
• 1 input layer that receives images of size 160x160 with 3 channels, with ReLU activation
• 2 hidden layers, consisting of 200 and 60 neurons with ReLU activation
• 1 output layer with sigmoid activation, which gives the probability for a binary classification model
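A minimal Keras sketch of that architecture; flattening the 160x160x3 image before the dense layers, the Adam optimizer, and the binary cross-entropy loss are assumptions on top of the bullets above:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    # Input: 160x160 images with 3 channels, flattened to a vector.
    layers.Input(shape=(160, 160, 3)),
    layers.Flatten(),
    # 2 hidden layers with 200 and 60 neurons, ReLU activation.
    layers.Dense(200, activation="relu"),
    layers.Dense(60, activation="relu"),
    # Output layer: sigmoid gives a probability for binary classification.
    layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```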
Further reading:
• Neural nets, here
• Neural nets in depth, here
• Types of loss functions, here

Books:
• Deep Learning, Ian Goodfellow
• Hands-On Machine Learning with Scikit-Learn and TensorFlow, Aurélien Géron

Other resources:
• techdevguide.withgoogle.com/paths/machine-learning/
• ML Crash Course
• Kaggle