Android Developers Guide to Machine Learning with MLKit and TensorFlow

Presented at the Google Developer Agency Day Johannesburg 2018.

Spoke about MLKit, Firebase and Custom TensorFlow models.

Rebecca Franks

June 28, 2018

Transcript

  1. Android Developers Guide
    to Machine Learning
    With MLKit, TensorFlow & Firebase

  2. Rebecca Franks
    @riggaroo
    Google Developer Expert
    Android @ Over
    Pluralsight Author
    GDG Johannesburg Organiser

  3. Who considers themselves an
    expert in Machine Learning?

  5. What is Machine
    Learning?

  6. Machine learning is an application of
    Artificial Intelligence in which we input
    a lot of data and let the machines learn
    “by themselves”

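To make the definition above concrete, here is a toy pure-Kotlin example (not from the talk): instead of hard-coding the rule y = 2x + 1, we let the program estimate it from example data using least-squares regression.

```kotlin
// Ordinary least squares for a single feature: the "learning" is just
// estimating slope and intercept from example (x, y) pairs.
fun fitLine(xs: DoubleArray, ys: DoubleArray): Pair<Double, Double> {
    val meanX = xs.average()
    val meanY = ys.average()
    var cov = 0.0   // covariance accumulator
    var varX = 0.0  // variance accumulator
    for (i in xs.indices) {
        cov += (xs[i] - meanX) * (ys[i] - meanY)
        varX += (xs[i] - meanX) * (xs[i] - meanX)
    }
    val slope = cov / varX
    return slope to (meanY - slope * meanX)
}

fun main() {
    // Data generated by y = 2x + 1; the fit recovers the rule.
    val xs = doubleArrayOf(1.0, 2.0, 3.0, 4.0)
    val ys = doubleArrayOf(3.0, 5.0, 7.0, 9.0)
    val (slope, intercept) = fitLine(xs, ys)
    println("slope=$slope intercept=$intercept") // slope=2.0 intercept=1.0
}
```

The point of the toy: no rule is written into the code; the parameters come entirely from the data.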

  15. Face detection
      Works offline

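The face detection slides here are image-only. For reference, with the 2018 ML Kit for Firebase API used in the barcode and OCR snippets later in the deck, a face detection call followed the same FirebaseVision pattern. The sketch below is a reconstruction, not slide content; the option names (ACCURATE_MODE, ALL_LANDMARKS) are from that era's API and may have changed, and `bitmap` is assumed to be in scope:

```
val options = FirebaseVisionFaceDetectorOptions.Builder()
    .setModeType(FirebaseVisionFaceDetectorOptions.ACCURATE_MODE)
    .setLandmarkType(FirebaseVisionFaceDetectorOptions.ALL_LANDMARKS)
    .build()
val detector = FirebaseVision.getInstance().getVisionFaceDetector(options)
detector.detectInImage(FirebaseVisionImage.fromBitmap(bitmap))
    .addOnSuccessListener { faces ->
        // Each FirebaseVisionFace carries a bounding box and optional landmarks.
        faces.forEach { face -> Log.d("FaceDemo", "bounds=${face.boundingBox}") }
    }
    .addOnFailureListener { e -> Log.e("FaceDemo", "Face detection failed", e) }
```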

  22. Demo

  24. ⛩ Landmark detection


  27. Demo


  29. Image Labelling
      Works offline

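The image labelling slides are also image-only in this transcript. Following the same FirebaseVision pattern as the other snippets, a labelling call would look roughly like this sketch (the `visionLabelDetector` accessor and the `label`/`confidence` properties match the 2018 API as best I recall; treat them as assumptions):

```
val detector = FirebaseVision.getInstance().visionLabelDetector
detector.detectInImage(FirebaseVisionImage.fromBitmap(bitmap))
    .addOnSuccessListener { labels ->
        // Each FirebaseVisionLabel pairs a text label with a confidence score.
        labels.forEach { Log.d("LabelDemo", "${it.label}: ${it.confidence}") }
    }
    .addOnFailureListener { e -> Log.e("LabelDemo", "Labelling failed", e) }
```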

  31. Barcode scanning
      Works offline


  34. val options = FirebaseVisionBarcodeDetectorOptions.Builder()
          .setBarcodeFormats(FirebaseVisionBarcode.FORMAT_QR_CODE)
          .build()
      val image = FirebaseVisionImage.fromBitmap(bitmap)
      val detector = FirebaseVision.getInstance()
          .getVisionBarcodeDetector(options)
      detector.detectInImage(image)
          .addOnSuccessListener {
              processedBitmap.postValue(barcodeProcessor.drawBoxes(bitmap, it))
              var result = String()
              it.forEach { barcode ->
                  result += "VALUE TYPE: ${barcode.valueType} Raw Value: ${barcode.rawValue}"
                  textResult.postValue(result)
              }
          }
          .addOnFailureListener {
              textResult.postValue(it.message)
          }


  38. OCR
      Works offline


  40. On Device vs Cloud

  41. private fun doOcrDetection(bitmap: Bitmap) {
          val detector = FirebaseVision.getInstance()
              .visionTextDetector
          val firebaseImage = FirebaseVisionImage.fromBitmap(bitmap)
          detector.detectInImage(firebaseImage)
              .addOnSuccessListener {
                  processedBitmap.postValue(ocrProcessor.drawBoxes(bitmap, it))
                  var result = String()
                  it.blocks.forEach { block ->
                      result += " " + block.text
                      textResult.postValue(result)
                  }
              }
              .addOnFailureListener {
                  Toast.makeText(/* … */ "Error detecting Text $it" /* … */)
              }
      }


  45. Custom TensorFlow Models
      Works offline

  46. TensorFlow


  49. Retrain an existing model: mobilenet_v1

  50. What if I could tell what kind of chips I was
    eating?

  51. Gather training data (FFMPEG → folders of images)
      → Retrain with new images
      → Optimize for mobile
      → Embed in app / Store in Firebase
      → App uses model

  52. Gather training
    data

  53. Export to Images using ffmpeg
    ffmpeg -i flings.mp4 flings/flings_%04d.jpg

  54. Folders of images

  55. Retrain with new images
    python -m scripts.retrain \
    --bottleneck_dir=tf_files/bottlenecks \
    --how_many_training_steps=500 \
    --model_dir=tf_files/models/ \
    --summaries_dir=tf_files/training_summaries/"${ARCHITECTURE}" \
    --output_graph=tf_files/retrained_graph.pb \
    --output_labels=tf_files/retrained_labels.txt \
    --architecture="${ARCHITECTURE}" \
    --image_dir=training_data/south_african_chips

  56. Optimize for mobile
    bazel-bin/tensorflow/contrib/lite/toco/toco \
    --input_file=AgencyDay/retrained_graph.pb \
    --output_file=AgencyDay/chips_optimized_graph.tflite \
    --input_format=TENSORFLOW_GRAPHDEF \
    --output_format=TFLITE \
    --input_shape=1,${IMAGE_SIZE},${IMAGE_SIZE},3 \
    --input_array=input \
    --output_array=final_result \
    --inference_type=FLOAT \
    --input_data_type=FLOAT

  57. Insert into App
    https://codelabs.developers.google.com/codelabs/tensorflow-for-poets-2-tflite/
    https://github.com/googlecodelabs/tensorflow-for-poets-2

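The codelabs linked above wrap the converted model in the TensorFlow Lite `Interpreter`. As a rough sketch of the embedding step (`loadModelFile`, `IMAGE_SIZE`, and `labels` are assumed helpers from the codelab style, not from the slides):

```
// Load the converted .tflite model and run one inference.
val tflite = Interpreter(loadModelFile(assets, "chips_optimized_graph.tflite"))
val input = ByteBuffer.allocateDirect(4 * IMAGE_SIZE * IMAGE_SIZE * 3)
    .order(ByteOrder.nativeOrder())
// ...fill input with the bitmap's normalised RGB floats...
val output = Array(1) { FloatArray(labels.size) }
tflite.run(input, output)
// The highest-scoring index maps back into retrained_labels.txt.
val best = output[0].indices.maxBy { output[0][it] }
```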
  58. Nik Nak or Not?
      bit.ly/mlkit-riggaroo

  59. What if our model
    changes?

  60. Ship an app update
      and hope that people download it

  61. Host on Firebase
    Updates automatically downloaded

  62. val cloudSource = FirebaseCloudModelSource.Builder("my_cloud_model")
          .enableModelUpdates(true)
          .setInitialDownloadConditions(conditions)
          .setUpdatesDownloadConditions(conditions)
          .build()
      FirebaseModelManager.getInstance()
          .registerCloudModelSource(cloudSource)
      ……

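Slide 62 trails off; the remaining steps of the custom-model flow create an interpreter over the registered source and run it. The sketch below uses the 2018 ML Kit custom-model classes as best I recall them, so treat the exact names, the 1×224×224×3 input shape, and `labelCount`/`imageBuffer` as assumptions:

```
val modelOptions = FirebaseModelOptions.Builder()
    .setCloudModelName("my_cloud_model")
    .build()
val interpreter = FirebaseModelInterpreter.getInstance(modelOptions)
val ioOptions = FirebaseModelInputOutputOptions.Builder()
    .setInputFormat(0, FirebaseModelDataType.FLOAT32, intArrayOf(1, 224, 224, 3))
    .setOutputFormat(0, FirebaseModelDataType.FLOAT32, intArrayOf(1, labelCount))
    .build()
val inputs = FirebaseModelInputs.Builder().add(imageBuffer).build()
interpreter?.run(inputs, ioOptions)
    ?.addOnSuccessListener { result ->
        // probabilities[0] lines up with the labels file.
        val probabilities = result.getOutput<Array<FloatArray>>(0)
    }
```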
  63. g.co/codelabs/mlkit-android-custom-model

  64. You don’t need to be a ML Expert to
    take advantage of ML in your apps!

  65. Thank you!

  66. Resources
      - https://codelabs.developers.google.com/codelabs/tensorflow-for-poets
      - https://codelabs.developers.google.com/codelabs/tensorflow-for-poets-2-tflite/
      - https://codelabs.developers.google.com/codelabs/mlkit-android-custom-model/
      - https://github.com/riggaroo/android-demo-mlkit
