Understanding ML Kit offerings in Android & iOS

ML Kit is an SDK that makes it more straightforward for mobile developers to incorporate machine learning features into their apps. Since ML Kit is a core ingredient of Firebase, its ML APIs can be adopted in both Android and iOS apps as seamlessly as any other Firebase feature such as Analytics or Crashlytics.

The flexibility ML Kit gives mobile developers to choose on-device or on-cloud ML APIs per use case is significant. In this session we take a deep dive into the ready-to-use APIs of ML Kit, which encapsulate Mobile Vision, the Google Cloud Vision API, TensorFlow Lite and the Neural Networks API, and discuss how custom models can be hosted seamlessly. We cover the newer NLP features, Language Identification and Smart Reply, look at applications of the recently announced Object Detection and Tracking API, and take a sneak peek at AutoML Vision Edge, with which custom image classification models can be created for our own requirements.


Gaurav Bhatnagar

September 25, 2019

Transcript

  1. Understanding ML Kit offerings in Android/iOS. Gaurav Bhatnagar @bhatnagar_g

  2. Beautiful Crete !!!!

  3. Agenda: Background, ML Kit Stack and Ingredients, Base APIs, Custom Models & AutoML, Recap.

  4. Significance of Machine Learning

  5. Climbing the ML Mountain (Achievable vs. Hard): on-device ML challenges, on-cloud ML drawbacks, inflexibility in using custom TF Lite models, and the underlying mathematics and data science.

  6. Introducing the Firebase ML Kit SDK: optimized for mobile, ready-to-use APIs, making Google's ML expertise easily accessible. On-device ML, on-cloud APIs, custom models & AutoML. Making machine learning normal.

  7. Features of the ML Kit SDK: available for both Android & iOS devices; Base APIs (out-of-the-box solutions) now in Vision & NLP; included in the Firebase suite; flexibility to use custom TF Lite models (via dynamic downloads) or build our own models with AutoML Vision Edge.

  8. ML Kit Stack: TensorFlow Lite, Mobile Vision API, Google Cloud Vision API, Neural Networks API, iOS Metal API, NLP APIs.

  9. ML Kit Ingredients: Base ML APIs using Mobile Vision and the Google Cloud Vision API include Image Labelling, Landmark Recognition, Text Recognition, Barcode Scanning, Face Detection, and Object Detection & Tracking.

  10. More Base APIs: Base ML APIs using Natural Language Processing include Language Identification, Smart Reply, and Translation.

  11. Ease of switching for predefined ML capabilities. On-Device: Text Recognition, Face Detection, Barcode Scanning, Image Labelling, Smart Replies, Language Identification, Translation, ODT. On-Cloud: Text Recognition, Image Labelling, Landmark Recognition.

  12. Basic workflow of an ML Kit app: create a project in Firebase, download google-services.json into the current project, enable the Cloud API, provide the input data, and apply ML via base or custom models (see the setup sketch below).

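A minimal sketch of the Gradle side of this setup, using the Kotlin DSL; the artifact versions below are illustrative placeholders rather than the exact releases used in the talk, and some features pull in additional model artifacts.

    // app/build.gradle.kts - illustrative sketch; check the Firebase release
    // notes for the artifact versions current for your SDK.
    plugins {
        id("com.android.application")
        kotlin("android")
        id("com.google.gms.google-services") // reads google-services.json
    }

    dependencies {
        // Vision base APIs (text, labels, barcodes, faces, landmarks, ODT)
        implementation("com.google.firebase:firebase-ml-vision:24.0.0")
        // NLP base APIs (language id, smart reply, translation)
        implementation("com.google.firebase:firebase-ml-natural-language:22.0.0")
    }
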
  13. Mechanisms to provide input to Vision: a FirebaseVisionImage can be created from a Bitmap, an image URI/File, a media.Image, a ByteArray or a ByteBuffer (a sketch of each follows).

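A minimal Kotlin sketch of these input mechanisms; the helper function names and the NV21 frame parameters are assumptions for illustration, not part of the slides.

    import android.content.Context
    import android.graphics.Bitmap
    import android.media.Image
    import android.net.Uri
    import com.google.firebase.ml.vision.common.FirebaseVisionImage
    import com.google.firebase.ml.vision.common.FirebaseVisionImageMetadata

    // Each factory wraps a different input type in a FirebaseVisionImage.
    fun fromBitmap(bitmap: Bitmap) = FirebaseVisionImage.fromBitmap(bitmap)

    fun fromUri(context: Context, uri: Uri) = FirebaseVisionImage.fromFilePath(context, uri)

    fun fromMediaImage(image: Image) =
        FirebaseVisionImage.fromMediaImage(image, FirebaseVisionImageMetadata.ROTATION_0)

    fun fromNv21(bytes: ByteArray, width: Int, height: Int): FirebaseVisionImage {
        // Metadata describes the raw frame so ML Kit can decode it.
        val metadata = FirebaseVisionImageMetadata.Builder()
            .setWidth(width)
            .setHeight(height)
            .setFormat(FirebaseVisionImageMetadata.IMAGE_FORMAT_NV21)
            .setRotation(FirebaseVisionImageMetadata.ROTATION_90)
            .build()
        return FirebaseVisionImage.fromByteArray(bytes, metadata)
    }
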
  14. Text Recognition workflow: capture an image via the camera, convert it to a Bitmap, wrap it in a FirebaseVisionImage, process it with a FirebaseVisionTextRecognizer, and read the resulting FirebaseVisionText lines & blocks. Result structure: FirebaseVisionText: textBlocks -> lines -> elements -> text; FirebaseVisionCloudText: pages -> blocks -> paragraphs -> words -> symbols.

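A minimal Kotlin sketch of this workflow with the on-device recognizer (swap in cloudTextRecognizer for the cloud variant); the recognizeText helper name is made up for this example.

    import android.graphics.Bitmap
    import com.google.firebase.ml.vision.FirebaseVision
    import com.google.firebase.ml.vision.common.FirebaseVisionImage

    fun recognizeText(bitmap: Bitmap) {
        val image = FirebaseVisionImage.fromBitmap(bitmap)
        // Use cloudTextRecognizer instead for the Cloud Vision backed variant.
        val recognizer = FirebaseVision.getInstance().onDeviceTextRecognizer

        recognizer.processImage(image)
            .addOnSuccessListener { result ->
                // textBlocks -> lines -> elements, as on the slide
                for (block in result.textBlocks) {
                    for (line in block.lines) {
                        for (element in line.elements) {
                            println("word: ${element.text}")
                        }
                    }
                }
            }
            .addOnFailureListener { e -> e.printStackTrace() }
    }
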
  15. Text Recognition demo: using the Cloud Vision API vs. the on-device API. Google Codelabs: https://goo.gl/hyEb3r

  16. Image Labelling workflow: capture an image via the camera, convert it to a Bitmap, wrap it in a FirebaseVisionImage, process it with a FirebaseVisionLabelDetector (configured through FirebaseVisionLabelDetectorOptions), and read the resulting FirebaseVisionLabels: name, confidence & entity ID.

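A minimal Kotlin sketch of the labelling workflow using the detector and options classes named on the slide; later SDK releases renamed these classes, and the 0.7 confidence threshold is just an illustrative value.

    import android.graphics.Bitmap
    import com.google.firebase.ml.vision.FirebaseVision
    import com.google.firebase.ml.vision.common.FirebaseVisionImage
    import com.google.firebase.ml.vision.label.FirebaseVisionLabelDetectorOptions

    fun labelImage(bitmap: Bitmap) {
        val image = FirebaseVisionImage.fromBitmap(bitmap)
        val options = FirebaseVisionLabelDetectorOptions.Builder()
            .setConfidenceThreshold(0.7f)   // ignore low-confidence labels
            .build()
        val detector = FirebaseVision.getInstance().getVisionLabelDetector(options)

        detector.detectInImage(image)
            .addOnSuccessListener { labels ->
                for (label in labels) {
                    // name, confidence and Knowledge Graph entity ID, as on the slide
                    println("${label.label} (${label.confidence}) -> ${label.entityId}")
                }
            }
            .addOnFailureListener { e -> e.printStackTrace() }
    }
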
  17. Features of the on-device and on-cloud image labelling APIs. Pricing: on-device is free; cloud is free for the first 1000 requests every month. Label coverage: 400+ labels on-device; 10,000+ labels across many categories in the cloud. Knowledge Graph entity ID support: available in both.

  18. Image Labelling Recorded Demo GitHub Repo : https://bit.ly/2RITNsn

  19. Image Labelling demo: original picture, result using the Cloud API, result using the on-device API.

  20. Image courtesy : https://goo.gl/8Yg5LX Types of Barcode Supported

  21. Barcode Scanning workflow: a detected FirebaseVisionBarcode reports a value type (TYPE_PHONE, TYPE_SMS, TYPE_URL, TYPE_WIFI, TYPE_CALENDAR_EVENT, TYPE_CONTACT_INFO, TYPE_DRIVER_LICENSE, TYPE_EMAIL, TYPE_GEO) with a matching structured payload (FirebaseVisionBarcode.Phone, .Sms, .Url, .WiFi, .CalendarEvent, .ContactInfo, .DriverLicense, .Email, .Geo).

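A minimal Kotlin sketch of the scanning workflow; restricting the detector to QR and EAN-13 formats is an illustrative choice, not something the slide prescribes.

    import android.graphics.Bitmap
    import com.google.firebase.ml.vision.FirebaseVision
    import com.google.firebase.ml.vision.barcode.FirebaseVisionBarcode
    import com.google.firebase.ml.vision.barcode.FirebaseVisionBarcodeDetectorOptions
    import com.google.firebase.ml.vision.common.FirebaseVisionImage

    fun scanBarcodes(bitmap: Bitmap) {
        val image = FirebaseVisionImage.fromBitmap(bitmap)
        // Restricting formats speeds up detection; omit the options to scan all formats.
        val options = FirebaseVisionBarcodeDetectorOptions.Builder()
            .setBarcodeFormats(
                FirebaseVisionBarcode.FORMAT_QR_CODE,
                FirebaseVisionBarcode.FORMAT_EAN_13
            )
            .build()
        val detector = FirebaseVision.getInstance().getVisionBarcodeDetector(options)

        detector.detectInImage(image)
            .addOnSuccessListener { barcodes ->
                for (barcode in barcodes) {
                    // valueType tells you which structured payload to read
                    when (barcode.valueType) {
                        FirebaseVisionBarcode.TYPE_URL -> println("URL: ${barcode.url?.url}")
                        FirebaseVisionBarcode.TYPE_WIFI -> println("Wi-Fi SSID: ${barcode.wifi?.ssid}")
                        else -> println("Raw value: ${barcode.rawValue}")
                    }
                }
            }
            .addOnFailureListener { e -> e.printStackTrace() }
    }
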
  22. Landmark Detection workflow: build a FirebaseVisionImage (from a ByteArray, ByteBuffer, image URI/File, Bitmap or media.Image), pass it to a FirebaseVisionCloudLandmarkDetector (configured via FirebaseVisionCloudDetectorOptions), and receive FirebaseVisionCloudLandmark instances exposing getBoundingBox(), getConfidence(), getEntityId(), getLandmark() and getLocations().

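A minimal Kotlin sketch of this workflow; the LATEST_MODEL and max-results settings are illustrative, and the detector needs a network connection since landmark recognition is cloud-only.

    import android.graphics.Bitmap
    import com.google.firebase.ml.vision.FirebaseVision
    import com.google.firebase.ml.vision.cloud.FirebaseVisionCloudDetectorOptions
    import com.google.firebase.ml.vision.common.FirebaseVisionImage

    fun detectLandmarks(bitmap: Bitmap) {
        val image = FirebaseVisionImage.fromBitmap(bitmap)
        val options = FirebaseVisionCloudDetectorOptions.Builder()
            .setModelType(FirebaseVisionCloudDetectorOptions.LATEST_MODEL)
            .setMaxResults(5)
            .build()
        // Landmark recognition runs only in the cloud, so the device must be online.
        val detector = FirebaseVision.getInstance().getVisionCloudLandmarkDetector(options)

        detector.detectInImage(image)
            .addOnSuccessListener { landmarks ->
                for (landmark in landmarks) {
                    println("${landmark.landmark} (${landmark.confidence})")
                    println("entity=${landmark.entityId} box=${landmark.boundingBox} at ${landmark.locations}")
                }
            }
            .addOnFailureListener { e -> e.printStackTrace() }
    }
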
  23. Landmark Detection Demo Original Pic Processed Pic

  24. Landmark Detection Recorded Demo GitHub Repo : https://bit.ly/2RITNsn

  25. Face Detection components: face detection, face tracking, landmarks, classification.

  26. Face Detection features: FirebaseVisionFace.boundingBox, FirebaseVisionFace.leftEyeOpenProbability, FirebaseVisionFace.rightEyeOpenProbability, FirebaseVisionFace.smilingProbability, FirebaseVisionFaceLandmark.NOSE_BASE, FirebaseVisionFaceLandmark.LEFT_MOUTH, FirebaseVisionFaceLandmark.RIGHT_MOUTH, FirebaseVisionFaceLandmark.BOTTOM_MOUTH (the picture does not show all face detection features).

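A minimal Kotlin sketch showing how these features are read from a detected face; the chosen options (accurate mode, all landmarks, all classifications, tracking) are illustrative.

    import android.graphics.Bitmap
    import com.google.firebase.ml.vision.FirebaseVision
    import com.google.firebase.ml.vision.common.FirebaseVisionImage
    import com.google.firebase.ml.vision.face.FirebaseVisionFaceDetectorOptions
    import com.google.firebase.ml.vision.face.FirebaseVisionFaceLandmark

    fun detectFaces(bitmap: Bitmap) {
        val image = FirebaseVisionImage.fromBitmap(bitmap)
        val options = FirebaseVisionFaceDetectorOptions.Builder()
            .setPerformanceMode(FirebaseVisionFaceDetectorOptions.ACCURATE)
            .setLandmarkMode(FirebaseVisionFaceDetectorOptions.ALL_LANDMARKS)
            .setClassificationMode(FirebaseVisionFaceDetectorOptions.ALL_CLASSIFICATIONS)
            .enableTracking()   // assign IDs so faces can be followed across frames
            .build()
        val detector = FirebaseVision.getInstance().getVisionFaceDetector(options)

        detector.detectInImage(image)
            .addOnSuccessListener { faces ->
                for (face in faces) {
                    println("box=${face.boundingBox} smiling=${face.smilingProbability}")
                    println("rightEyeOpen=${face.rightEyeOpenProbability}")
                    val nose = face.getLandmark(FirebaseVisionFaceLandmark.NOSE_BASE)
                    println("noseBase=${nose?.position}")
                }
            }
            .addOnFailureListener { e -> e.printStackTrace() }
    }
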
  27. Face Detection Demo GitHub Repo : https://bit.ly/2RITNsn

  28. Object Detection and Tracking: ML Kit ODT runs on-device to detect objects, track objects and perform coarse classification; a cloud visual search such as Cloud Vision Product Search (or any other solution) can then handle the detected object. Google Codelabs (ODT): http://bit.ly/2kQrHji

  29. Block diagram of the on-device component: ML Kit ODT is built from a localizer, a classifier and a tracker (stage latencies of roughly < 10 ms, 50 ms and 50 ms), producing FirebaseVisionObject results.

  30. Object Detection & Tracking details: FirebaseVisionObjectDetectorOptions configures the detector mode (single image / stream mode), single vs. multiple objects, and classification. The FirebaseVisionObject returned after processing carries a bounding box, a classification category, and a tracking ID (in stream mode).

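A minimal Kotlin sketch of the detector options and the fields of the returned FirebaseVisionObject; stream mode with classification and multiple objects is just one illustrative configuration.

    import android.graphics.Bitmap
    import com.google.firebase.ml.vision.FirebaseVision
    import com.google.firebase.ml.vision.common.FirebaseVisionImage
    import com.google.firebase.ml.vision.objects.FirebaseVisionObjectDetectorOptions

    fun detectObjects(bitmap: Bitmap) {
        val image = FirebaseVisionImage.fromBitmap(bitmap)
        val options = FirebaseVisionObjectDetectorOptions.Builder()
            .setDetectorMode(FirebaseVisionObjectDetectorOptions.STREAM_MODE) // or SINGLE_IMAGE_MODE
            .enableMultipleObjects()
            .enableClassification()  // coarse category classification
            .build()
        val detector = FirebaseVision.getInstance().getOnDeviceObjectDetector(options)

        detector.processImage(image)
            .addOnSuccessListener { objects ->
                for (obj in objects) {
                    // the tracking ID is only meaningful in stream mode
                    println("box=${obj.boundingBox} category=${obj.classificationCategory} id=${obj.trackingId}")
                }
            }
            .addOnFailureListener { e -> e.printStackTrace() }
    }
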
  31. Language Identification (Base ML APIs): recognizes text in 110 different languages; fast, responding within 1-2 ms on Android or iOS; identifyLanguage() returns a BCP-47 language code for the input text; a list of possible languages with their confidence levels can also be returned. Google Codelabs: http://bit.ly/2koCGjI

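A minimal Kotlin sketch of both calls mentioned on the slide; the identifyTextLanguage helper name is made up for the example.

    import com.google.firebase.ml.naturallanguage.FirebaseNaturalLanguage

    fun identifyTextLanguage(text: String) {
        val identifier = FirebaseNaturalLanguage.getInstance().languageIdentification

        // Single best guess: a BCP-47 code, or "und" when the language is undetermined.
        identifier.identifyLanguage(text)
            .addOnSuccessListener { code -> println("language = $code") }
            .addOnFailureListener { e -> e.printStackTrace() }

        // Or ask for all plausible languages with their confidence values.
        identifier.identifyPossibleLanguages(text)
            .addOnSuccessListener { candidates ->
                for (candidate in candidates) {
                    println("${candidate.languageCode}: ${candidate.confidence}")
                }
            }
    }
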
  32. Smart Reply (Base APIs): provides suggestions based on the last 10 messages in a conversation (although it works with a single previous message as well); stateless API that runs fully on device; no message history is kept in memory or on a server; Google worked closely with textPlus to ensure the API is production-ready. Google Codelabs: http://bit.ly/2kR9GkW

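A minimal Kotlin sketch of the Smart Reply call; the conversation contents and the "friend_id" user ID are invented for illustration.

    import com.google.firebase.ml.naturallanguage.FirebaseNaturalLanguage
    import com.google.firebase.ml.naturallanguage.smartreply.FirebaseTextMessage
    import com.google.firebase.ml.naturallanguage.smartreply.SmartReplySuggestionResult

    fun suggestReplies() {
        // The most recent messages of the conversation, oldest first.
        val conversation = listOf(
            FirebaseTextMessage.createForRemoteUser(
                "Are you coming to the meetup tonight?", System.currentTimeMillis(), "friend_id"),
            FirebaseTextMessage.createForLocalUser(
                "I am still at work", System.currentTimeMillis())
        )

        FirebaseNaturalLanguage.getInstance().smartReply
            .suggestReplies(conversation)
            .addOnSuccessListener { result ->
                if (result.status == SmartReplySuggestionResult.STATUS_SUCCESS) {
                    result.suggestions.forEach { println(it.text) }
                }
            }
            .addOnFailureListener { e -> e.printStackTrace() }
    }
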
  33. Translation API (on-device, 1/2): translation available between 59 languages; uses the same NMT models as the Google Translate app in offline mode; offered at no additional cost; functions fully on device. Google Codelabs: http://bit.ly/2mjdyvf

  34. Translation API (on-device, 2/2): phrase-based MT --> neural machine translation; discrete local decisions -> continuous global decisions; language packs are downloaded dynamically; to reduce the number of language pairs, English is used as an intermediate language. Example (Greek -> English -> Japanese): Πότε σκοπεύετε να πάτε στην παραλία; -> When are you planning to go to the beach? -> いつビーチに行く予定ですか？ (Itsu bīchi ni iku yoteidesu ka?)

  35. Translation workflow: configure FirebaseTranslatorOptions (source & destination languages), create a FirebaseTranslator, download the model if needed (subject to FirebaseModelDownloadConditions), then pass text in the source language and receive text in the destination language.

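A minimal Kotlin sketch of this workflow for a Greek-to-English translator, matching the slide's Greek example; the require-Wi-Fi download condition is an illustrative choice.

    import com.google.firebase.ml.common.modeldownload.FirebaseModelDownloadConditions
    import com.google.firebase.ml.naturallanguage.FirebaseNaturalLanguage
    import com.google.firebase.ml.naturallanguage.translate.FirebaseTranslateLanguage
    import com.google.firebase.ml.naturallanguage.translate.FirebaseTranslatorOptions

    fun translateGreekToEnglish(text: String) {
        val options = FirebaseTranslatorOptions.Builder()
            .setSourceLanguage(FirebaseTranslateLanguage.EL)   // Greek
            .setTargetLanguage(FirebaseTranslateLanguage.EN)   // English
            .build()
        val translator = FirebaseNaturalLanguage.getInstance().getTranslator(options)

        // Language packs download dynamically; here we only allow Wi-Fi downloads.
        val conditions = FirebaseModelDownloadConditions.Builder().requireWifi().build()
        translator.downloadModelIfNeeded(conditions)
            .addOnSuccessListener {
                translator.translate(text)
                    .addOnSuccessListener { translated -> println(translated) }
                    .addOnFailureListener { e -> e.printStackTrace() }
            }
            .addOnFailureListener { e -> e.printStackTrace() }
    }
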
  36. ML Kit + Material Design (Object Detection & Tracking, Barcode Scanning). GitHub repo: http://bit.ly/MLkitmaterial

  37. Custom Models: ML Kit acts as an API layer for your custom model. Implementation path: build & train a custom TF model, convert it to a TF Lite model, host it with Firebase, use the TF Lite model for inference. Main benefits: automatic model fallback, automatic model updates, A/B testing and specializing custom models using Remote Config (dynamic selection).

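A minimal Kotlin sketch of the hosted-model path; the model name "my_image_classifier" and the input/output shapes are hypothetical, and the package paths follow the Firebase custom-model API of that period, which shifted across releases.

    import com.google.firebase.ml.common.modeldownload.FirebaseModelDownloadConditions
    import com.google.firebase.ml.common.modeldownload.FirebaseModelManager
    import com.google.firebase.ml.custom.FirebaseCustomRemoteModel
    import com.google.firebase.ml.custom.FirebaseModelDataType
    import com.google.firebase.ml.custom.FirebaseModelInputOutputOptions
    import com.google.firebase.ml.custom.FirebaseModelInputs
    import com.google.firebase.ml.custom.FirebaseModelInterpreter
    import com.google.firebase.ml.custom.FirebaseModelInterpreterOptions

    fun runHostedModel(input: Array<Array<Array<FloatArray>>>) {
        // "my_image_classifier" is a hypothetical name of a TF Lite model hosted in the Firebase console.
        val remoteModel = FirebaseCustomRemoteModel.Builder("my_image_classifier").build()
        val conditions = FirebaseModelDownloadConditions.Builder().requireWifi().build()

        FirebaseModelManager.getInstance().download(remoteModel, conditions)
            .addOnSuccessListener {
                val interpreter = FirebaseModelInterpreter.getInstance(
                    FirebaseModelInterpreterOptions.Builder(remoteModel).build())
                // Shapes are illustrative: a 224x224 RGB image in, 5 class scores out.
                val ioOptions = FirebaseModelInputOutputOptions.Builder()
                    .setInputFormat(0, FirebaseModelDataType.FLOAT32, intArrayOf(1, 224, 224, 3))
                    .setOutputFormat(0, FirebaseModelDataType.FLOAT32, intArrayOf(1, 5))
                    .build()
                val inputs = FirebaseModelInputs.Builder().add(input).build()
                interpreter?.run(inputs, ioOptions)
                    ?.addOnSuccessListener { outputs ->
                        val scores = outputs.getOutput<Array<FloatArray>>(0)[0]
                        println(scores.joinToString())
                    }
            }
    }
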
  38. Custom Models Demo Google Code labs: https://goo.gl/92n5dY

  39. AutoML Vision Edge: create your own dataset, train the model with AutoML Vision Edge, obtain TensorFlow Lite models, and serve these on-demand image classification models through ML Kit.

  40. Evaluating the Model

  41. AutoML Vision Edge: with the combination of ML Kit & AutoML we can train, refine, evaluate and deploy a model in the mobile app to achieve our objective; almost 1.8x faster than handcrafted models. Custom Image Classifier: http://bit.ly/2knilLB. Google Codelabs: http://bit.ly/2mikN6F

  42. Recap: the main purpose of the ML Kit SDK is to shorten the path from ideation to final delivery of a specific use case; more Base (ready-to-use) APIs will continue to be added; more scenarios in the Vision, Speech & Text space will continue to be covered; the collaboration of ML & Material Design also becomes an important reference point; AutoML and the custom image classifier simplify the use of mobile-optimized custom models.

  43. References: https://goo.gl/fovxvH - ML Kit introduction; https://goo.gl/CHfMhU - Google I/O 2018 ML Kit video; https://goo.gl/v9iFWF - ML Kit series by Joe Birch; https://goo.gl/8pVYqj - ML series by Britt Barak; https://bit.ly/2OflaMF - ML series by Mark Allison; https://bit.ly/2ycT5LW - ML series by Ray Wenderlich; https://bit.ly/2Eeu3By - ML series by Harshit Dwivedi; http://bit.ly/fbasamples - GitHub repo for Android samples; http://bit.ly/2m2lbGf - Google I/O 2019 ML Kit video; https://bit.ly/2RITNsn - GitHub sample repo

  44. "AI is probably the most important thing humanity has ever worked on. I think of it as something more profound than electricity or fire." - Sundar Pichai. "Artificial Intelligence is the New Electricity" - Andrew Ng. @bhatnagar_g