
A Real use case of practical AR-AI couple - Size my luggage

You may have been reading about AR as the platform of the future, even though, so far, you have mostly been able to experience it through gaming and entertainment use cases.

However, there are effective non-gaming examples: many travel and transport companies are using AR to help you understand whether your luggage fits their size limits. In this session, you will discover these apps and explore how to develop one as a native Android app using ARCore, Sceneform and a bit of AI with Firebase ML Kit. By the end you will be able to apply these new skills to exciting new use cases.

Giovanni Laquidara

July 02, 2019

Transcript

  1. AR can bring anything to you. It adds computer-generated information and objects to your everyday world.
  2. KLM

  3. ARCore provides SDKs for many of the most popular development

    environments. These SDKs provide native APIs for all of the essential AR features like motion tracking, environmental understanding, and light estimation. With these capabilities you can build entirely new AR experiences or enhance existing apps with AR features. Use your favourite environment.
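    Before showing any AR UI, an app can ask whether the device supports ARCore at all. A minimal sketch using the ArCoreApk API (the surrounding flow is an assumption, not from the deck; the ArFragment shown later performs this check for you):

    import android.app.Activity
    import com.google.ar.core.ArCoreApk

    // Minimal sketch: check that ARCore is supported and installed before
    // starting any AR experience.
    fun checkArCoreAvailability(activity: Activity) {
        val availability = ArCoreApk.getInstance().checkAvailability(activity)
        if (availability.isTransient) {
            // The result is not known yet; re-query after a short delay.
            return
        }
        if (availability.isSupported) {
            // Safe to create an ARCore session / show the AR entry point.
        } else {
            // Device does not support ARCore; hide the AR entry point.
        }
    }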
  4. Sceneform

    Sceneform is a 3D framework that makes it easy for Java developers to build augmented reality apps.
    • A high-level scene graph API (see the sketch below)
    • A realistic physically based renderer provided by Filament
    • An Android Studio plugin for importing, viewing, and building 3D assets
    https://developers.google.com/ar/develop/java/sceneform/
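    As a taste of the scene graph API, nodes can be nested and positioned relative to a parent (a sketch; parentNode and labelRenderable are assumed placeholders, not from the deck):

    import com.google.ar.sceneform.Node
    import com.google.ar.sceneform.math.Vector3

    // Sketch of the scene graph API: plain Nodes can be nested and
    // positioned relative to their parent node.
    val labelNode = Node()
    labelNode.setParent(parentNode)
    labelNode.localPosition = Vector3(0f, 0.5f, 0f) // half a metre above the parent
    labelNode.renderable = labelRenderable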
  5. Android sample integration

    dependencies {
        …
        // Provides ArFragment, and other Sceneform UX resources:
        implementation "com.google.ar.sceneform.ux:sceneform-ux:1.9.0"
    }
  6. Android sample integration

    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }

    Sceneform libraries use language constructs from Java 8. Add these compile options if targeting minSdkVersion < 26.
  7. ArFragment

    1. Checks whether a compatible version of ARCore is installed, prompting the user to install or update as necessary
    2. Checks whether the app has access to the camera, and asks the user for permission if it has not yet been granted

    <fragment
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:id="@+id/ux_fragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />
  8. Anchor

    // Create the Anchor.
    Anchor anchor = hitResult.createAnchor();
    AnchorNode anchorNode = new AnchorNode(anchor);
    anchorNode.setParent(arFragment.getArSceneView().getScene());

    // Create the transformable andy and add it to the anchor.
    TransformableNode andy = new TransformableNode(arFragment.getTransformationSystem());
    andy.setParent(anchorNode);
    andy.setRenderable(andyRenderable);
    andy.select();
  9. Renderable

    A Renderable is a 3D model that can be placed anywhere in the scene and consists of Meshes, Materials and Textures. Renderables can be created from:
    • Standard Android views: ViewRenderables are rendered as flat 2D cards in the 3D scene, while maintaining the ability to interact with them via touch.
    • 3D asset files (OBJ, FBX, glTF) can be imported, converted, and previewed using the Android Studio plugin. For more information, see Import and Preview 3D Assets.
    • Basic shapes and materials can be programmatically combined to create more complicated objects at runtime (see the sketch below).
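    A minimal sketch of the last option, combining a basic shape and a material at runtime (context, cubeRenderable, the colour and the dimensions are assumed example values, not from the deck):

    import com.google.ar.sceneform.math.Vector3
    import com.google.ar.sceneform.rendering.Color
    import com.google.ar.sceneform.rendering.MaterialFactory
    import com.google.ar.sceneform.rendering.ShapeFactory

    // Build a red cube Renderable programmatically; the dimensions are
    // arbitrary example values, not the luggage limits from the talk.
    MaterialFactory.makeOpaqueWithColor(context, Color(android.graphics.Color.RED))
        .thenAccept { material ->
            cubeRenderable = ShapeFactory.makeCube(
                Vector3(0.55f, 0.35f, 0.25f),  // size in metres
                Vector3(0f, 0.175f, 0f),       // centre offset
                material
            )
        }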
  10. Renderable

    ModelRenderable.builder()
        .setSource(this, R.raw.andy)
        .build()
        .thenAccept(renderable -> andyRenderable = renderable)
        .exceptionally(throwable -> {
            Toast toast = Toast.makeText(this, "Unable to load andy renderable", Toast.LENGTH_LONG);
            toast.setGravity(Gravity.CENTER, 0, 0);
            toast.show();
            return null;
        });
  11. Renderable from Layout

    ViewRenderable.builder()
        .setView(context, R.layout.luggage_size_length)
        .build()
        .thenAccept { renderable -> luggageLengthRenderable = renderable }

    ViewRenderable.builder()
        .setView(context, R.layout.luggage_size_height)
        .build()
        .thenAccept { renderable -> luggageHeightRenderable = renderable }
  12. Get the ArFragment

    override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
        super.onViewCreated(view, savedInstanceState)
        sceneformFragment = childFragmentManager.findFragmentById(R.id.ux_fragment) as ArFragment
    }
  13. Prepare Renderables

    // When you build a Renderable, Sceneform loads its resources in the background while returning
    // a CompletableFuture. Call thenAccept(), handle(), or check isDone() before calling get().
    ModelRenderable.builder()
        .setSource(context, Uri.parse(LUGGAGE_MODEL))
        .build()
        .thenAccept { renderable -> luggageRenderable = renderable }
        .exceptionally {
            val toast = Toast.makeText(context, "Unable to load luggage renderable", Toast.LENGTH_LONG)
            toast.setGravity(Gravity.CENTER, 0, 0)
            toast.show()
            null
        }
  14. Prepare Renderables

    ViewRenderable.builder()
        .setView(context, R.layout.luggage_size_length)
        .build()
        .thenAccept { renderable -> luggageLengthRenderable = renderable }

    ViewRenderable.builder()
        .setView(context, R.layout.luggage_size_height)
        .build()
        .thenAccept { renderable -> luggageHeightRenderable = renderable }
  15. Handle event

    sceneformFragment.setOnTapArPlaneListener { hitResult: HitResult, plane: Plane, motionEvent: MotionEvent ->
        luggageRenderable?.let {
            // Create the Anchor.
            val anchor = hitResult.createAnchor()
            val anchorNode = AnchorNode(anchor)
            anchorNode.setParent(sceneformFragment.arSceneView.scene)

            // Create the transformable andy and add it to the anchor.
            val andy = DraggableRotableNode(sceneformFragment.transformationSystem)
            andy.setParent(anchorNode)
            andy.renderable = luggageRenderable
            andy.select()
        }
    }
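    For the sizing itself, one possible building block (a hypothetical helper, not shown in the deck) is the straight-line distance between two tapped anchors, for example on opposite edges of the bag:

    import com.google.ar.core.Pose
    import kotlin.math.sqrt

    // Hypothetical helper: distance in metres between two ARCore poses,
    // e.g. the poses of two anchors created from taps on the plane.
    fun distanceBetween(a: Pose, b: Pose): Float {
        val dx = a.tx() - b.tx()
        val dy = a.ty() - b.ty()
        val dz = a.tz() - b.tz()
        return sqrt(dx * dx + dy * dy + dz * dz)
    }

    // Usage: distanceBetween(firstAnchor.pose, secondAnchor.pose)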
  16. What kind of Node is it

    class DraggableRotableNode(transformationSystem: TransformationSystem) :
        BaseTransformableNode(transformationSystem) {

        /** Returns the controller that translates this node using a drag gesture. */
        val translationController: TranslationController

        /** Returns the controller that rotates this node using a twist gesture. */
        val rotationController: RotationController
  17. What kind of Node is it

        init {
            translationController = TranslationController(this, transformationSystem.dragRecognizer)
            addTransformationController(translationController)

            rotationController = RotationController(this, transformationSystem.twistRecognizer)
            addTransformationController(rotationController)
        }
    }

    Note that, unlike the stock TransformableNode, this node registers no ScaleController, so a pinch gesture cannot resize the model; for a sizing use case the luggage presumably has to stay at real-world scale.
  18. Configuration and code analysis

    dependencies {
        // MLKit Dependencies
        implementation 'com.google.firebase:firebase-ml-vision:21.0.0'
        implementation 'com.google.firebase:firebase-ml-vision-object-detection-model:17.0.0'
    }
  19. Configuration and code analysis

    val optionsBuilder = FirebaseVisionObjectDetectorOptions.Builder()
        .setDetectorMode(FirebaseVisionObjectDetectorOptions.STREAM_MODE)
    if (PreferenceUtils.isClassificationEnabled(graphicOverlay.context)) {
        optionsBuilder.enableClassification()
    }
    this.detector = FirebaseVision.getInstance().getOnDeviceObjectDetector(optionsBuilder.build())
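    Feeding a camera frame to this detector looks roughly as follows (a sketch; data, previewSize and detector are assumed from the surrounding code, and the NV21 format matches the conversion shown later):

    import com.google.firebase.ml.vision.common.FirebaseVisionImage
    import com.google.firebase.ml.vision.common.FirebaseVisionImageMetadata

    // Sketch: wrap NV21 bytes in a FirebaseVisionImage and run the detector.
    val metadata = FirebaseVisionImageMetadata.Builder()
        .setFormat(FirebaseVisionImageMetadata.IMAGE_FORMAT_NV21)
        .setWidth(previewSize.width)
        .setHeight(previewSize.height)
        .setRotation(FirebaseVisionImageMetadata.ROTATION_90)
        .build()
    val image = FirebaseVisionImage.fromByteArray(data, metadata)
    detector.processImage(image)
        .addOnSuccessListener { objects ->
            // Each FirebaseVisionObject carries a bounding box, a tracking id
            // and (with classification enabled) a category.
        }
        .addOnFailureListener { e -> /* log / surface the error */ }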
  20. Configuration and code analysis

    detectInImage(image)
        .addOnSuccessListener { results ->
            Log.d(TAG, "Latency is: " + (SystemClock.elapsedRealtime() - startMs))
            onSuccess(image, results, graphicOverlay)
            processLatestFrame(graphicOverlay)
        }
        .addOnFailureListener { onFailure(it) }
  21. Configuration and code analysis

    // Observes changes on the object to search; when one happens, fires an image search request.
    objectToSearch.observe(this@LiveObjectDetectionActivity, Observer { detectObject ->
        searchEngine.search(detectObject) { detectedObject, products ->
            workflowModel.onSearchCompleted(detectedObject, products)
        }
    })
  22. Configuration and code analysis

    fun onSearchCompleted(detectedObject: DetectedObject, products: List<Product>) {
        val titleList = products.map { it.title }
        if (titleList.contains("Bag") || titleList.contains("Baggage") ||
            titleList.contains("Luggage") || titleList.contains("Backpack")) {
            val lConfirmedObject = confirmedObject
            if (detectedObject != lConfirmedObject) {
                // Drops the search result from the object that has lost focus.
                return
            }
            objectIdsToSearch.remove(detectedObject.objectId)
  23. Integration with ARCore/Sceneform

    @Synchronized
    @Throws(IOException::class)
    internal fun start(scene: Scene?) {
        this.scene = scene
        if (scene == null) return
        graphicOverlay.setTransformInfo(Size(640, 480))
        scene.addOnUpdateListener { processingRunnable.setNextFrame(it) }
        processingThread = Thread(processingRunnable).apply {
            processingRunnable.setActive(true)
            start()
        }
    }
  24. Integration with ARCore/Sceneform

    try {
        frame.acquireCameraImage().use { image ->
            if (image.format != ImageFormat.YUV_420_888) {
                throw IllegalArgumentException(
                    "Expected image in YUV_420_888 format, got format " + image.format)
            }
            Log.d("Hello", "Image acquired at $frameTime")
            previewSize = Size(image.width, image.height)
            val data = ImageConversion.YUV_420_888toNV21(image)
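    ImageConversion.YUV_420_888toNV21 is a project helper; a minimal sketch of what such a conversion does (assuming no row-stride padding on any plane, which production code must handle):

    import android.media.Image

    // Hypothetical minimal conversion: copy the Y plane, then interleave V/U
    // as NV21 expects. Assumes rowStride == width on all planes (no padding).
    fun yuv420888ToNv21(image: Image): ByteArray {
        val width = image.width
        val height = image.height
        val out = ByteArray(width * height * 3 / 2)
        image.planes[0].buffer.get(out, 0, width * height) // luminance plane
        val uBuffer = image.planes[1].buffer
        val vBuffer = image.planes[2].buffer
        val uvPixelStride = image.planes[1].pixelStride   // 1 (planar) or 2 (semi-planar)
        var offset = width * height
        for (i in 0 until width * height / 4) {
            out[offset++] = vBuffer.get(i * uvPixelStride) // V comes first in NV21
            out[offset++] = uBuffer.get(i * uvPixelStride)
        }
        return out
    }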
  25. Integration with ARCore/Sceneform

    try {
        synchronized(processorLock) {
            val rotation = ROTATION_90
            val frameMetadata = FrameMetadata(previewSize.width, previewSize.height, rotation)
            data?.let { frameProcessor.process(it, frameMetadata, graphicOverlay) }
        }
    } catch (t: Exception) {
        Log.e(TAG, "Exception thrown from receiver.", t)
    }
  26. Integration problems

    applicationVariants.all { variant ->
        variant.getRuntimeConfiguration().exclude group: 'com.google.code.findbugs', module: 'jsr305'
        variant.getRuntimeConfiguration().exclude group: 'com.google.j2objc', module: 'j2objc-annotations'
        variant.getRuntimeConfiguration().exclude group: 'org.checkerframework', module: 'checker-compat-qual'
        variant.getRuntimeConfiguration().exclude group: 'org.codehaus.mojo', module: 'animal-sniffer-annotations'
    }

    These excludes work around annotation libraries that both the Sceneform and Firebase ML dependency trees pull in, which otherwise cause duplicate-dependency conflicts at build time.
  27. Next steps

    1. Proper TF mobile model (using Firebase)
    2. Without Firebase, or with Firebase to create the model
    3. Using OpenCV, identify vertices and measure distances
    4. 3D object detection with custom models