
Customize Your App With MLKit

The best app is one that's customized for your user, and machine learning is one of the best ways to accomplish this. Machine learning can seem like a daunting topic, but Google's MLKit makes it easy. In this talk, we'll go over how you can make use of this tool in your own mobile applications, with special attention to the new Smart Reply and Language Detection. We'll also cover how you can easily create your very own custom models with Auto ML Vision Edge. You'll leave with an understanding of the tools needed to use machine learning in your apps.


Victoria Gonda

November 02, 2019

Transcript

  1. Customize Your App With MLKit Victoria Gonda

  2. Hello! I'm Victoria Gonda. I'm an Android Engineer at Buffer
     and an author on RayWenderlich.com. You can find me on Twitter at @TTGonda
  3. MLKit

  6. On Device ◍ In the Cloud

  8. Barcode scanning ◍ Face detection ◍ Image labeling ◍ Landmark detection ◍ Object detection
     and tracking ◍ Text recognition ◍ Custom ◍ Language ID ◍ On device translation ◍ Smart reply
  9. Erik Hellman - Machine Learning on mobile with MLKit, Øredev 2018
  10. Vision APIs

  11. Object Detection and Tracking ◍ "Localize and track in real

    time the most prominent object in the live camera feed."
  12. implementation 'com.google.firebase:firebase-ml-vision:24.0.0'
      implementation 'com.google.firebase:firebase-ml-vision-object-detection-model:19.0.2'

  13. val options = FirebaseVisionObjectDetectorOptions.Builder()
          .setDetectorMode(FirebaseVisionObjectDetectorOptions.SINGLE_IMAGE_MODE)
          .enableMultipleObjects()
          .enableClassification()
          .build()

  18. val objectDetector = FirebaseVision.getInstance()
          .getOnDeviceObjectDetector(options)

  19. val image = FirebaseVisionImage.fromBitmap(selectedImage)

  21. objectDetector.processImage(image)
          .addOnSuccessListener { detectedObjects ->
              // Process result
          }
          .addOnFailureListener { e ->
              // Handle error
          }
  24. firebaseVisionObject.boundingBox
      firebaseVisionObject.trackingId // null in SINGLE_IMAGE_MODE
      firebaseVisionObject.classificationCategory
      firebaseVisionObject.classificationConfidence
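Pulling those pieces together, a minimal Kotlin sketch of reading these properties from each detected object (error handling as in the earlier slides) could look like this:

      objectDetector.processImage(image)
          .addOnSuccessListener { detectedObjects ->
              for (obj in detectedObjects) {
                  val box = obj.boundingBox                  // Rect around the object
                  val category = obj.classificationCategory  // one of the FirebaseVisionObject.CATEGORY_* constants
                  val confidence = obj.classificationConfidence
                  // obj.trackingId stays null in SINGLE_IMAGE_MODE
              }
          }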

  25. // Swift for iOS
      let options = VisionObjectDetectorOptions()
      options.detectorMode = .singleImage
      options.shouldEnableMultipleObjects = true
      options.shouldEnableClassification = true
  26. let objectDetector = Vision.vision().objectDetector(options: options)
      let image = VisionImage(image: uiImage)
  27. objectDetector.process(image) { detectedObjects, error in
          guard error == nil else {
              // Error.
              return
          }
          guard let detectedObjects = detectedObjects, !detectedObjects.isEmpty else {
              // No objects detected.
              return
          }
          // Success.
      }
  29. Barcode Scanning ◍ "Scan and process barcodes."

  30. Face Detection ◍ "Detect faces and facial landmarks."

  31. Image Labeling ◍ "Identify objects, locations, activities, animal species, products,

    and more."
  32. Landmark Detection ◍ "Identify popular landmarks in an image."

  33. Text Recognition ◍ "Recognize and extract text from images."
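The deck doesn't walk through code for Text Recognition; as a rough sketch, the on-device recognizer from the same firebase-ml-vision dependency follows the familiar pattern (error handling trimmed):

      val recognizer = FirebaseVision.getInstance().onDeviceTextRecognizer
      recognizer.processImage(FirebaseVisionImage.fromBitmap(selectedImage))
          .addOnSuccessListener { visionText ->
              // Full string, or walk visionText.textBlocks for lines and bounding boxes
              val extracted = visionText.text
          }
          .addOnFailureListener { e ->
              // Handle error
          }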

  35. Language APIs

  36. Language ID ◍ "Determine the language of a string of

    text with only a few words."
  37. implementation 'com.google.firebase:firebase-ml-natural-language:22.0.0'
      implementation 'com.google.firebase:firebase-ml-natural-language-language-id-model:20.0.7'

  38. val options = FirebaseLanguageIdentificationOptions.Builder()
          .setConfidenceThreshold(0.34f)
          .build()

  39. val languageIdentifier = FirebaseNaturalLanguage.getInstance()
          .getLanguageIdentification(options)

  40. languageIdentifier.identifyLanguage(text)
          .addOnSuccessListener { languageCode ->
              // Use result
          }
          .addOnFailureListener {
              // Handle error
          }
  42. if (languageCode == "und") {
          // Language not confidently detected
      } else {
          // Use language code
      }
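If you want a confidence score for each candidate rather than a single best guess, the same identifier also exposes an identifyPossibleLanguages variant; a rough sketch:

      languageIdentifier.identifyPossibleLanguages(text)
          .addOnSuccessListener { identifiedLanguages ->
              identifiedLanguages.forEach { identified ->
                  // e.g. "en" at 0.87; "und" again means undetermined
                  Log.d("LanguageId", "${identified.languageCode} (${identified.confidence})")
              }
          }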
  44. On Device Translation ◍ "Translate text between 58 languages, entirely

    on device."
  45. implementation 'com.google.firebase:firebase-ml-natural-language:22.0.0'
      implementation 'com.google.firebase:firebase-ml-natural-language-translate-model:20.0.7'

  46. val options = FirebaseTranslatorOptions.Builder()
          .setSourceLanguage(FirebaseTranslateLanguage.ES)
          .setTargetLanguage(FirebaseTranslateLanguage.EN)
          .build()

  50. val spanishEnglishTranslator = FirebaseNaturalLanguage.getInstance()
          .getTranslator(options)

  51. spanishEnglishTranslator.downloadModelIfNeeded()
          .addOnSuccessListener {
              // Model downloaded successfully
              // Okay to start translating
          }
          .addOnFailureListener { exception ->
              // Handle error
          }
  53. spanishEnglishTranslator.translate(text)
          .addOnSuccessListener { translatedText ->
              // Use translated text
          }
          .addOnFailureListener { exception ->
              // Handle error
          }
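The downloaded translation model holds on to native resources, so it's worth releasing the translator once you're finished; a minimal sketch inside an Activity, assuming the translator exposes Closeable's close():

      override fun onDestroy() {
          spanishEnglishTranslator.close()
          super.onDestroy()
      }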
  56. Smart Reply ◍ "Generate reply suggestions in text conversations."

  57. implementation 'com.google.firebase:firebase-ml-natural-language:22.0.0'
      implementation 'com.google.firebase:firebase-ml-natural-language-smart-reply-model:20.0.7'

  58. android {
          aaptOptions {
              noCompress "tflite"
          }
      }

  59. val conversation = mutableListOf<FirebaseTextMessage>()
      conversation.add(FirebaseTextMessage.createForLocalUser(
          "Hi!", System.currentTimeMillis()))

  61. conversation.add(FirebaseTextMessage.createForRemoteUser(
          "It was great meeting you at Øredev!", System.currentTimeMillis(), userId))
      conversation.add(FirebaseTextMessage.createForRemoteUser(
          "Want to keep in touch?", System.currentTimeMillis(), userId))
  62. val smartReply = FirebaseNaturalLanguage.getInstance().smartReply

  63. smartReply.suggestReplies(conversation)
          .addOnSuccessListener { result ->
              if (result.status == STATUS_NOT_SUPPORTED_LANGUAGE) {
                  // The conversation's language isn't supported
              } else if (result.status == STATUS_SUCCESS) {
                  // Show suggestions
              }
          }
          .addOnFailureListener {
              // Handle error
          }
  67. result.suggestions.forEach {
          // Show suggestion
      }
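Each suggestion carries the suggested reply string; a small sketch of surfacing them, where showReplyChip is a hypothetical helper in your UI:

      result.suggestions.forEach { suggestion ->
          // Bind suggestion.text to a tappable chip the user can send
          showReplyChip(suggestion.text)
      }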

  69. AutoML Vision Edge

  70. AutoML Vision Edge ◍ "Generate custom image classification models to

    use on device from your own library of images."
  72. implementation 'com.google.firebase:firebase-ml-vision:24.0.0'
      implementation 'com.google.firebase:firebase-ml-vision-automl:18.0.2'

  73. val remoteModel = FirebaseAutoMLRemoteModel.Builder("recipe_model")
          .build()

  74. val conditions = FirebaseModelDownloadConditions.Builder()
          .requireWifi()
          .build()
      FirebaseModelManager.getInstance()
          .download(remoteModel, conditions)
          .addOnCompleteListener {
              // Success.
          }
  76. android {
          aaptOptions {
              noCompress "tflite"
          }
      }

  77. val localModel = FirebaseAutoMLLocalModel.Builder()
          .setAssetFilePath("manifest.json")
          .build()

  78. val options = FirebaseVisionOnDeviceAutoMLImageLabelerOptions.Builder(localModel) // or remoteModel
          .setConfidenceThreshold(0.5f)
          .build()
      val labeler = FirebaseVision.getInstance()
          .getOnDeviceAutoMLImageLabeler(options)
  80. val image = FirebaseVisionImage.fromBitmap(selectedImage)
      labeler.processImage(image)
          .addOnSuccessListener { labels ->
              // Use labels
          }
          .addOnFailureListener { e ->
              // :(
          }
  82. for (label in labels) {
          val text = label.text
          val confidence = label.confidence
      }
  83. AutoML with Object Detection ◍ Detect object in image ◍ Crop image ◍ Apply ML model
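The deck doesn't show code for this pipeline; a rough Kotlin sketch that reuses the objectDetector and labeler built earlier, with bounds checking omitted, could look like this:

      objectDetector.processImage(FirebaseVisionImage.fromBitmap(selectedImage))
          .addOnSuccessListener { detectedObjects ->
              detectedObjects.firstOrNull()?.let { detected ->
                  // Crop the original bitmap to the detected bounding box
                  val box = detected.boundingBox
                  val cropped = Bitmap.createBitmap(
                      selectedImage, box.left, box.top, box.width(), box.height())

                  // Classify just the cropped region with the custom AutoML model
                  labeler.processImage(FirebaseVisionImage.fromBitmap(cropped))
                      .addOnSuccessListener { labels ->
                          // Use labels for the detected object
                      }
              }
          }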
  84. Case Studies

  85. Zyl

  86. Lose It!

  87. <Your app here />

  88. Thanks! You can find me at @TTGonda & VictoriaGonda.com