A Real use case of practical AR-AI couple - Size my luggage

You may have read about AR as the future platform, even though today most of the experiences you can try are still gaming and entertainment use cases.

However, there are effective non-gaming examples: many travel and transport companies are using AR to help you understand whether your luggage fits within their size limits. In this session, you will discover these apps and explore how to develop them as an Android native app using ARCore, Sceneform and a bit of AI with Firebase ML Kit. At the end you will be able to apply these new skills to new exciting use cases.

Giovanni Laquidara

July 02, 2019


  1. A Real use case of practical AR-AI couple Size my

    luggage #DCBerlin19
  2. None
  3. None
  4. @joaolaq https://laquysoft.com/

  5. AR can *bring* anything to you. It adds computer-generated

    information and objects to your everyday world. @joaolaq
  6. What’s the status of AR (on mobile devices)

  7. What’s the status of AR

  8. What’s the status of AR

  9. What’s the status of AR

  10. AR at Google I/O

  11. AR at Google I/O https://twitter.com/i/status/1125810617774772224

  12. AR at Google I/O

  13. Ikea Place

  14. Ebay

  15. Ebay

  16. Measure is a good tool in Travel too

  17. Hackathon

  18. KLM

  19. EasyJet https://www.youtube.com/watch?v=J-0P6ytH7tQ

  20. Kayak https://www.youtube.com/watch?v=IBwHAhqo0oE

  21. Ryanair https://www.express.co.uk/travel/articles/1115654/ryanair-flights-hand-luggage-mobile-app-carry-on-cabin-baggage-allowance-size-news

  22. iOS Only

  23. What about Android

  24. https://developers.google.com/ar/


  26. https://developers.google.com/ar/develop/java/augmented-faces/

  27. Again...

  28. ARCore provides SDKs for many of the most popular development

    environments. These SDKs provide native APIs for all of the essential AR features like motion tracking, environmental understanding, and light estimation. With these capabilities you can build entirely new AR experiences or enhance existing apps with AR features. Use your favourite environment @joaolaq
  29. None
  30. None
  31. None
  32. Sceneform Sceneform is a 3D framework that makes it easy

    for Java developers to build augmented reality apps. • A high-level scene graph API • A realistic physically based renderer provided by Filament • An Android Studio plugin for importing, viewing, and building 3D assets https://developers.google.com/ar/develop/java/sceneform/
  33. Development integration

  34. Android sample integration

  35. Android sample integration

  36. Superimpose a shape to compare
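The superimposed shape makes the comparison visual; the underlying size check can be sketched in plain Kotlin. The function name, the Triple representation and the default 56×45×25 cm limit (the cube dimensions used later in the deck) are assumptions for illustration, not code from the app:

```kotlin
// Hypothetical fit check: does a measured bag fit an airline's
// cabin-bag limit? Dimensions are in metres, any orientation allowed.
fun fitsCabinLimit(
    luggage: Triple<Float, Float, Float>,
    limit: Triple<Float, Float, Float> = Triple(0.56f, 0.45f, 0.25f)
): Boolean {
    // Sort both triples largest-first so the check is
    // independent of how the bag is oriented.
    val bag = listOf(luggage.first, luggage.second, luggage.third).sortedDescending()
    val max = listOf(limit.first, limit.second, limit.third).sortedDescending()
    return bag.zip(max).all { (side, bound) -> side <= bound }
}
```

Sorting both triples in descending order makes the check orientation-independent: a bag measured as 0.25 × 0.56 × 0.45 m still fits a 0.56 × 0.45 × 0.25 m limit.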

  37. Android sample integration dependencies { … // Provides ArFragment, and

    other Sceneform UX resources: implementation "com.google.ar.sceneform.ux:sceneform-ux:1.9.0" }
  38. Android sample integration compileOptions { sourceCompatibility JavaVersion.VERSION_1_8 targetCompatibility JavaVersion.VERSION_1_8 }

    Sceneform libraries use language constructs from Java 8. Add these compile options if targeting minSdkVersion < 26.
  39. ARFragment 1. Checks whether a compatible version of ARCore is

    installed, prompting the user to install or update as necessary 2. Checks whether the app has access to the camera, and asks the user for permission if it has not yet been granted <fragment android:name="com.google.ar.sceneform.ux.ArFragment" android:id="@+id/ux_fragment" android:layout_width="match_parent" android:layout_height="match_parent" />
  40. Anchor // Create the Anchor. Anchor anchor = hitResult.createAnchor(); AnchorNode

    anchorNode = new AnchorNode(anchor); anchorNode.setParent(arFragment.getArSceneView().getScene()); // Create the transformable andy and add it to the anchor. TransformableNode andy = new TransformableNode(arFragment.getTransformationSystem()); andy.setParent(anchorNode); andy.setRenderable(andyRenderable); andy.select();
  41. on Tap Listener arFragment.setOnTapArPlaneListener( (HitResult hitResult, Plane plane, MotionEvent motionEvent)

    -> { /* create the Anchor and attach the renderable here, as on the Anchor slide */ });
  42. Renderable A Renderable is a 3D model that can be

    placed anywhere in the scene and consists of Meshes, Materials and Textures. Renderables can be created from: • Standard Android ViewRenderables are rendered as flat 2D cards in the 3D scene, while maintaining the ability to interact with them via touch. • 3D asset files (OBJ, FBX, glTF) can be imported, converted, and previewed using the Android Studio plugin. For more information, see Import and Preview 3D Assets. • Basic shapes and materials can be programmatically combined to create more complicated objects at runtime.
  43. Renderable ModelRenderable.builder() .setSource(this, R.raw.andy) .build() .thenAccept(renderable -> andyRenderable =

    renderable) .exceptionally(throwable -> { Toast toast = Toast.makeText(this, "Unable to load andy renderable", Toast.LENGTH_LONG); toast.setGravity(Gravity.CENTER, 0, 0); toast.show(); return null; });
  44. Shape Factory MaterialFactory.makeOpaqueWithColor(this, com.google.ar.sceneform.rendering.Color(Color.GREEN)) .thenAccept { luggageBB =

    ShapeFactory.makeCube(Vector3(.45f, .56f, .25f), Vector3(0f, 0f, -0.3f), it) }
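The cube above is sized in metres while airline limits are usually published in centimetres; a tiny hypothetical helper (not from the slides) makes the conversion explicit, with 45 × 56 × 25 cm matching the Vector3 values above:

```kotlin
// Hypothetical helper: convert a published cm limit to the metre
// values passed to ShapeFactory.makeCube.
fun cmToMetres(w: Int, h: Int, d: Int): Triple<Float, Float, Float> =
    Triple(w / 100f, h / 100f, d / 100f)
```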
  45. Shape Factory

  46. Renderable from Layout ViewRenderable.builder() .setView(context, R.layout.luggage_size_length) .build() .thenAccept { renderable

    -> luggageLengthRenderable = renderable } ViewRenderable.builder() .setView(context, R.layout.luggage_size_height) .build() .thenAccept { renderable -> luggageHeightRenderable = renderable }
  47. Get the ArFragment override fun onViewCreated(view: View, savedInstanceState: Bundle?) {

    super.onViewCreated(view, savedInstanceState) sceneformFragment = childFragmentManager.findFragmentById(R.id.ux_fragment) as ArFragment
  48. Prepare Renderables // When you build a Renderable, Sceneform loads

    its resources in the background while returning // a CompletableFuture. Call thenAccept(), handle(), or check isDone() before calling get(). ModelRenderable.builder() .setSource(context, Uri.parse(LUGGAGE_MODEL)) .build() .thenAccept { renderable -> luggageRenderable = renderable } .exceptionally { val toast = Toast.makeText(context, "Unable to load luggage renderable", Toast.LENGTH_LONG) toast.setGravity(Gravity.CENTER, 0, 0) toast.show() null }
  49. Prepare Renderables ViewRenderable.builder() .setView(context, R.layout.luggage_size_length) .build() .thenAccept { renderable ->

    luggageLengthRenderable = renderable } ViewRenderable.builder() .setView(context, R.layout.luggage_size_height) .build() .thenAccept { renderable -> luggageHeightRenderable = renderable }
  50. Handle event sceneformFragment.setOnTapArPlaneListener { hitResult: HitResult, plane: Plane, motionEvent: MotionEvent

    -> luggageRenderable?.let { // Create the Anchor. val anchor = hitResult.createAnchor() val anchorNode = AnchorNode(anchor) anchorNode.setParent(sceneformFragment.arSceneView.scene) // Create the transformable andy and add it to the anchor. val andy = DraggableRotableNode(sceneformFragment.transformationSystem) andy.setParent(anchorNode) andy.renderable = luggageRenderable andy.select()
  51. What kind of Node is it class DraggableRotableNode(transformationSystem: TransformationSystem) :

    BaseTransformableNode(transformationSystem) { /** * Returns the controller that translates this node using a drag gesture. */ val translationController: TranslationController /** * Returns the controller that rotates this node using a twist gesture. */ val rotationController: RotationController
  52. What kind of Node is it init { translationController =

    TranslationController(this, transformationSystem.dragRecognizer) addTransformationController(translationController) rotationController = RotationController(this, transformationSystem.twistRecognizer) addTransformationController(rotationController) }
  53. Demo

  54. Scan the object?

  55. ARKit (iOS) Only

  56. Cannot do it

  57. Ok... let's be smarter

  58. Ok... let's be smarter https://opencv.org/android/

  59. Ok... let's be smarter https://github.com/tensorflow/examples/tree/master/lite/examples/object_detection/android

  60. Ok... let's be smarter

  61. Firebase to the rescue https://firebase.google.com/products/ml-kit

  62. ML Kit Services

  63. Sample App https://github.com/firebase/mlkit-material-android

  64. Object Detection and Tracking Presented @ I/O 19 https://firebase.google.com/docs/ml-kit/android/detect-objects

  65. Object Detection and Tracking

  66. Configuration and code analysis dependencies { // MLKit Dependencies implementation

    'com.google.firebase:firebase-ml-vision:21.0.0' implementation 'com.google.firebase:firebase-ml-vision-object-detection-model:17.0.0' }
  67. Configuration and code analysis val optionsBuilder = FirebaseVisionObjectDetectorOptions.Builder() .setDetectorMode(FirebaseVisionObjectDetectorOptions.STREAM_MODE) if

    (PreferenceUtils.isClassificationEnabled(graphicOverlay.context)) { optionsBuilder.enableClassification() } this.detector = FirebaseVision.getInstance().getOnDeviceObjectDetector(optionsBuilder.build())
  68. Configuration and code analysis override fun detectInImage(image: FirebaseVisionImage): Task<List<FirebaseVisionObject>> {

    return detector.processImage(image) }
  69. Configuration and code analysis detectInImage(image) .addOnSuccessListener { results -> Log.d(TAG,

    "Latency is: " + (SystemClock.elapsedRealtime() - startMs)) this@FrameProcessorBase.onSuccess(image, results, graphicOverlay) processLatestFrame(graphicOverlay) } .addOnFailureListener { this@FrameProcessorBase.onFailure(it) }
  70. Configuration and code analysis // Observes changes on the object

    to search, if happens, fire image search request. objectToSearch.observe(this@LiveObjectDetectionActivity, Observer { detectObject -> searchEngine.search(detectObject) { detectedObject, products -> workflowModel.onSearchCompleted(detectedObject, products) } })
  71. Configuration and code analysis fun onSearchCompleted(detectedObject: DetectedObject, products:

    List<Product>) { val titleList = products.map { it.title } if (titleList.contains("Bag") || titleList.contains("Baggage") || titleList.contains("Luggage") || titleList.contains("Backpack")) { val lConfirmedObject = confirmedObject if (detectedObject != lConfirmedObject) { // Drops the search result from the object that has lost focus. return } objectIdsToSearch.remove(detectedObject.objectId)
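The keyword test in onSearchCompleted can be isolated as a small pure function. The keyword set and the case-insensitive normalisation below are assumptions for illustration, not the exact app code, which compares the raw titles:

```kotlin
// Sketch of the luggage-label check: do any of the product titles
// returned by the search engine look like a piece of luggage?
// Keyword list and lowercasing are assumptions, not the slide code.
val luggageKeywords = setOf("bag", "baggage", "luggage", "backpack")

fun looksLikeLuggage(titles: List<String>): Boolean =
    titles.any { it.lowercase() in luggageKeywords }
```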
  72. Integration with ARCore/Sceneform @Synchronized @Throws(IOException::class) internal fun start(scene: Scene?

    ) { this.scene = scene if (scene == null) return graphicOverlay.setTransformInfo(Size(640, 480)) scene.addOnUpdateListener { processingRunnable.setNextFrame(it) } processingThread = Thread(processingRunnable).apply { processingRunnable.setActive(true) start() } }
  73. Integration with ARCore/Sceneform try { frame.acquireCameraImage().use { image -> if

    (image.format != ImageFormat.YUV_420_888) { throw IllegalArgumentException( "Expected image in YUV_420_888 format, got format " + image.format) } Log.d("Hello", "Image acquired at $frameTime") previewSize = Size(image.width, image.height) val data = ImageConversion.YUV_420_888toNV21(image)
  74. Integration with ARCore/Sceneform try { synchronized(processorLock) { val rotation

    = ROTATION_90 val frameMetadata = FrameMetadata(previewSize.width, previewSize.height, rotation) data?.let { frameProcessor.process(it, frameMetadata, graphicOverlay) } } } catch (t: Exception) { Log.e(TAG, "Exception thrown from receiver.", t) }
  75. Integration problems

  76. Integration problems applicationVariants.all { variant -> variant.getRuntimeConfiguration().exclude group:

    'com.google.code.findbugs', module: 'jsr305' variant.getRuntimeConfiguration().exclude group: 'com.google.j2objc', module: 'j2objc-annotations' variant.getRuntimeConfiguration().exclude group: 'org.checkerframework', module: 'checker-compat-qual' variant.getRuntimeConfiguration().exclude group: 'org.codehaus.mojo', module: 'animal-sniffer-annotations' }
  77. Demo https://github.com/joaobiriba/arcoremlkit

  78. Demo

  79. Next steps 1. Proper TF mobile model (using Firebase) 2.

    Without Firebase, or with Firebase to create the model 3. Using OpenCV, identify vertices and measure distances 4. 3D object detection with custom models
  80. Wait for 3D object scanning

  81. Questions? @joaolaq https://laquysoft.com/