is that we'll have trained an image classifier which can recognize pet breeds at state-of-the-art accuracy. The key to this success is the use of transfer learning, which will be a key platform for much of this course. We also discuss how to set the most important hyperparameter when training neural networks: the learning rate, using Leslie Smith's fantastic learning rate finder method. Finally, we'll look at the important but rarely discussed topic of labeling, and learn about some of the features that fastai provides to help you easily add labels to your images. https://course.fast.ai/videos/?lesson=1
the organization and integration of data collected from various sources.

Techniques
You can collect data through questionnaires and surveys, interviews, and data scraping and data crawling.
recognize various classes of images. When we subsequently provide a new image as input to the model, it will output the probability of the image representing each of the classes it was trained on.
Cayaco: 0.02
Colimita: 0.96
Piedra Lisa: 0.01
Ticus: 0.00
Paramo: 0.01

Based on the output, we can see that the classification model has predicted that the image has a high probability of representing a Colimita beer.
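As a quick illustration (not part of the original slides), this minimal Kotlin sketch pairs per-class probabilities with their labels and picks the most likely class; the values simply mirror the example above.

fun main() {
    val labels = listOf("Cayaco", "Colimita", "Piedra Lisa", "Ticus", "Paramo")
    val probabilities = floatArrayOf(0.02f, 0.96f, 0.01f, 0.00f, 0.01f)

    // Index of the highest probability = the predicted class.
    val best = probabilities.indices.maxByOrNull { probabilities[it] }!!
    println("Predicted: ${labels[best]} (${probabilities[best]})")  // Predicted: Colimita (0.96)
}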
Error/L1 Loss
Classification losses: Hinge Loss/Multi-class SVM Loss, Cross Entropy Loss/Negative Log Likelihood

LOSS FUNCTIONS
To know which model is good for our data, we compute a loss function by comparing the predicted outputs to the actual outputs.
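To make the idea concrete, here is a minimal Kotlin sketch (not from the slides) of two common losses: mean squared error for regression and cross entropy/negative log likelihood for classification. The function names and example values are illustrative.

import kotlin.math.ln

// Mean Squared Error: average of squared differences between predictions and targets.
fun meanSquaredError(predicted: DoubleArray, actual: DoubleArray): Double =
    predicted.indices.map { i -> (predicted[i] - actual[i]) * (predicted[i] - actual[i]) }.average()

// Cross entropy / negative log likelihood for one sample:
// the negative log of the probability the model assigned to the true class.
fun crossEntropy(predictedProbs: DoubleArray, trueClass: Int): Double =
    -ln(predictedProbs[trueClass])

fun main() {
    println(meanSquaredError(doubleArrayOf(2.5, 0.0), doubleArrayOf(3.0, -0.5)))          // 0.25
    println(crossEntropy(doubleArrayOf(0.02, 0.96, 0.01, 0.00, 0.01), trueClass = 1))     // about 0.04
}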
basing your personal decisions all day long, most of the time without even consciously recognizing the process. https://mitsloan.mit.edu/ideas-made-to-matter/how-to-use-algorithms-to-solve-everyday-problems
algorithms. Second Order Optimization Algorithms use second-order derivatives (the Hessian) to minimize the loss function. https://towardsdatascience.com/types-of-optimization-algorithms-used-in-neural-networks-and-ways-to-optimize-gradient-95ae5d39529f
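As a hedged illustration (not from the slides), the Kotlin sketch below contrasts a first-order gradient descent step with a second-order (Newton) step on a one-variable toy objective, where the second derivative plays the role of the Hessian.

fun grad(x: Double) = 2.0 * (x - 3.0)   // f'(x) for f(x) = (x - 3)^2, minimized at x = 3
val hessian = 2.0                        // f''(x), the 1-D analogue of the Hessian

fun main() {
    var xGd = 0.0
    var xNewton = 0.0
    val learningRate = 0.1

    repeat(20) {
        // First-order step: move against the gradient, scaled by a learning rate.
        xGd -= learningRate * grad(xGd)
        // Second-order (Newton) step: also divide by the curvature.
        xNewton -= grad(xNewton) / hessian
    }
    println("gradient descent: $xGd")     // close to 3.0
    println("Newton's method:  $xNewton") // exactly 3.0 after the first step
}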
to help developers run TensorFlow models on mobile, embedded, and IoT devices. Its two main components are:
• TensorFlow Lite converter
• TensorFlow Lite interpreter

TensorFlow Lite converter
Converts TensorFlow models into an efficient form for use by the interpreter.
// ...
    compile 'org.tensorflow:tensorflow-lite:+'
}

android {
    aaptOptions {
        noCompress "tflite"
        noCompress "lite"
    }
}

These dependencies and aaptOptions settings go in the module's build.gradle; noCompress keeps the .tflite/.lite model files from being compressed when the APK is packaged.

TensorFlow Lite interpreter
The TensorFlow Lite interpreter is designed to be lean and fast. The interpreter uses a static graph ordering and a custom (less-dynamic) memory allocator to ensure minimal load, initialization, and execution latency.
AssetManager) {
    init {
        interpreter = Interpreter(loadModelFile(assetManager, MODEL_PATH))
        labels = loadLabelList(assetManager)
        ...
    }
}

// Name of the model file stored in Assets.
const val MODEL_PATH = "graph.lite"
// Name of the label file stored in Assets.
const val LABEL_PATH = "labels.txt"
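The loadModelFile and loadLabelList helpers referenced above are not shown on the slide; a typical implementation, mirroring the standard TensorFlow Lite Android samples (so treat it as a sketch rather than the course's exact code), memory-maps the model from assets and reads one label per line:

import android.content.res.AssetManager
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Memory-map the model file from the app's assets so the interpreter can read it
// without copying it onto the Java heap.
private fun loadModelFile(assetManager: AssetManager, modelPath: String): MappedByteBuffer {
    val fileDescriptor = assetManager.openFd(modelPath)
    FileInputStream(fileDescriptor.fileDescriptor).use { inputStream ->
        return inputStream.channel.map(
            FileChannel.MapMode.READ_ONLY,
            fileDescriptor.startOffset,
            fileDescriptor.declaredLength
        )
    }
}

// Read the label file (one label per line) from assets.
private fun loadLabelList(assetManager: AssetManager): List<String> =
    assetManager.open(LABEL_PATH).bufferedReader().readLines()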
to bytes
convertBitmapToByteBuffer(bitmap)

// An array to hold inference results, to be fed into TensorFlow Lite as outputs.
val recognitions = ArrayList<Result>()
val recognitionsSize = Math.min(pq.size, MAX_RESULTS)
for (i in 0 until recognitionsSize) {
    recognitions.add(pq.poll())
}
return@flatMap Single.just(recognitions)
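convertBitmapToByteBuffer is not shown here either; a common implementation, based on the standard TensorFlow Lite image-classification samples (the input size and normalization constants below are assumptions, not taken from this project), packs the bitmap's pixels into a direct ByteBuffer of floats:

import android.graphics.Bitmap
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Assumed model input: 224x224 RGB, float32, pixels scaled to [0, 1].
// The bitmap is assumed to be already resized to INPUT_SIZE x INPUT_SIZE.
private const val INPUT_SIZE = 224
private const val BYTES_PER_CHANNEL = 4

private fun convertBitmapToByteBuffer(bitmap: Bitmap): ByteBuffer {
    val buffer = ByteBuffer.allocateDirect(INPUT_SIZE * INPUT_SIZE * 3 * BYTES_PER_CHANNEL)
        .order(ByteOrder.nativeOrder())
    val pixels = IntArray(INPUT_SIZE * INPUT_SIZE)
    bitmap.getPixels(pixels, 0, INPUT_SIZE, 0, 0, INPUT_SIZE, INPUT_SIZE)
    for (pixel in pixels) {
        // Extract R, G, B channels from the packed ARGB int and normalize to [0, 1].
        buffer.putFloat(((pixel shr 16) and 0xFF) / 255.0f)
        buffer.putFloat(((pixel shr 8) and 0xFF) / 255.0f)
        buffer.putFloat((pixel and 0xFF) / 255.0f)
    }
    buffer.rewind()
    return buffer
}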
String.format("\n%s: %4.2f", label.key, label.value)

// Label (in this case PARAMO)
label.key
// Value (in this case 1.0)
label.value

Example output:
ticus (score=0.00000)
paramo (score=1.00000)
cayaco (score=0.00000)
piedra lisa (score=0.00000)
colimita (score=0.00000)
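The interpreter.run call itself is not visible in this excerpt; below is a hedged sketch of how the pieces above typically fit together inside the classifier. The [1][numLabels] output shape and the classify function name are assumptions; interpreter, labels, and convertBitmapToByteBuffer are the members shown earlier.

// Run inference: the input ByteBuffer goes in, a [1][numLabels] float array
// receives the per-class scores, which are then paired with the label list.
fun classify(bitmap: Bitmap): List<Pair<String, Float>> {
    val input = convertBitmapToByteBuffer(bitmap)
    val labelProbArray = Array(1) { FloatArray(labels.size) }
    interpreter.run(input, labelProbArray)
    return labels.indices
        .map { i -> labels[i] to labelProbArray[0][i] }
        .sortedByDescending { it.second }
}

// Usage (with a bitmap already scaled to the model's input size):
//   classify(bitmap).forEach { (label, score) ->
//       println(String.format("%s (score=%.5f)", label, score))  // e.g. "paramo (score=1.00000)"
//   }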