
Building ARCore apps using Sceneform


In this talk, we'll learn the fundamentals of Augmented Reality, the ARCore SDK, and the Sceneform framework, using simple examples. You will walk away with an AR app at the end of this talk. (DroidJam Edition)

Hari Vignesh Jayapalan

July 14, 2018


Transcript

  1. Hari Vignesh, Android/UX Engineer @ Habba.org
     Agenda:
     - AR Introduction
     - ARCore Fundamentals
     - ARCore with OpenGL
     - Build an AR app
     - Augmented Images
  2. Virtual Reality (Image credits: DeviantArt)
     VR is an interactive computer-generated experience taking place within a simulated environment, that incorporates mainly auditory and visual, but also other types of sensory feedback like haptic. — Wikipedia
  3. Augmented Reality (Image credits: DeviantArt)
     AR is a direct or indirect live view of a real-world environment whose elements are “augmented” by computer-generated perceptual information, ideally across multiple sensory modalities, including visual, auditory, haptic, somatosensory, and olfactory. — Wikipedia
  4. Mixed Reality (Image credits: ih0.redbubble.net)
     MR, sometimes referred to as hybrid reality, is the merging of real and virtual worlds to produce new environments and visualisations where physical and digital objects co-exist and interact in real time. — Wikipedia
  5. ARCore (Image credits: assets.materialup.com)
     ARCore is a Google platform for building immersive augmented reality experiences. It enables our phones to see the world, understand it, and interact with the objects and information in it.
  6. How does it work? (Image credits: assets.materialup.com)
     - It understands the position of the phone as it moves, relative to the world around it
     - From that position, it learns what the world looks like, building its own understanding of the environment
  7. Motion Tracking (Image credits: developers.google.com)
     - Tracks a user's motion and ultimately their position in 2D and 3D space
     - Allows us to track position changes by identifying and tracking visual feature points from the device's camera image
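As a rough sketch of what motion tracking exposes to us (assuming an already-created and resumed ARCore `Session` named `session`; the logging helper is illustrative, not from the talk), the tracked device pose can be read from each frame:

```kotlin
import android.util.Log
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Sketch: reading the pose that motion tracking computes each frame.
// `session` is assumed to be an existing, resumed ARCore Session.
fun logDevicePose(session: Session) {
    val frame = session.update()        // latest camera frame + tracking state
    val camera = frame.camera
    if (camera.trackingState == TrackingState.TRACKING) {
        val pose = camera.pose          // device position/orientation in world space
        Log.d("AR", "x=${pose.tx()} y=${pose.ty()} z=${pose.tz()}")
    }
}
```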
  8. Environmental Understanding (Image credits: developers.google.com)
     - ARCore looks for clusters of feature points that appear to lie on common horizontal or vertical surfaces and makes these surfaces available to your app as planes
     - We can use this information to place virtual objects resting on flat surfaces, using meshing
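A minimal sketch of querying those detected planes (again assuming an existing `Session`; the helper name is mine):

```kotlin
import com.google.ar.core.Plane
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Sketch: listing the planes ARCore has detected and is still tracking.
fun trackedPlanes(session: Session): List<Plane> =
    session.getAllTrackables(Plane::class.java)
        .filter { it.trackingState == TrackingState.TRACKING }

// Each Plane's `type` tells you its orientation:
// HORIZONTAL_UPWARD_FACING, HORIZONTAL_DOWNWARD_FACING or VERTICAL.
```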
  9. Light Estimation (Image credits: developers.google.com)
     - Provides a way to estimate the light in a scene, which we can then use to light and shade AR objects
     - It detects information about the lighting and provides the average intensity and colour correction of a given camera image
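Those two values are available on every frame; a hedged sketch of reading them (the function is illustrative):

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.LightEstimate

// Sketch: pulling the light estimate out of an ARCore frame.
fun sceneLighting(frame: Frame) {
    val estimate = frame.lightEstimate
    if (estimate.state == LightEstimate.State.VALID) {
        val intensity = estimate.pixelIntensity     // average scene brightness
        val correction = FloatArray(4)              // RGB correction + intensity
        estimate.getColorCorrection(correction, 0)
        // use intensity/correction to light and shade your AR objects
    }
}
```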
  10. ARCore with OpenGL (Gist: https://bit.ly/2yWag7j)
     - Initialise the renderer
     - Check whether ARCore is installed
     - Handle ARCore updates, install ARCore, and handle a bunch of other exceptions
     - Check whether the camera is available; if yes, ask for permission
     - If all of the above succeeds, resume the SurfaceView
     - Implement GLSurfaceView.Renderer, which has the methods onSurfaceCreated, onSurfaceChanged and onDrawFrame
     - Initialise all 3D models with shaders and material properties in onSurfaceCreated
     - In onDrawFrame, fetch the AR frame and let ARCore analyse it with hitTest
     - Create planes when the hit result is successful
     - Let ARCore track these points by adding Anchors
     - Create point clouds (dots)
     - On touch, place the rendered objects on the planes
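The renderer side of the checklist above can be sketched roughly like this; the `GLSurfaceView.Renderer` callbacks are real, while the drawing steps are placeholder comments (see the gist for the full version):

```kotlin
import android.opengl.GLSurfaceView
import android.view.Surface
import com.google.ar.core.Session
import javax.microedition.khronos.egl.EGLConfig
import javax.microedition.khronos.opengles.GL10

// Skeleton only: assumes the Session was created after the install,
// permission and availability checks described above.
class ArRenderer(private val session: Session) : GLSurfaceView.Renderer {

    override fun onSurfaceCreated(gl: GL10?, config: EGLConfig?) {
        // initialise 3D models, shaders and material properties here
    }

    override fun onSurfaceChanged(gl: GL10?, width: Int, height: Int) {
        // tell ARCore the viewport size so poses project correctly
        session.setDisplayGeometry(Surface.ROTATION_0, width, height)
    }

    override fun onDrawFrame(gl: GL10?) {
        val frame = session.update()   // latest camera image + tracking state
        // draw the camera background, run hitTest on taps,
        // then render anchored objects and point clouds
    }
}
```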
  11. Sceneform (Image credits: assets.materialup.com)
     Sceneform provides a high-level API for rendering 3D models using Java. It dramatically increases the speed at which developers can create AR experiences, without requiring deep knowledge of graphics programming or OpenGL.
  12. Our Goal
     - Create an AR app that detects planes and allows the user to place a 3D model
     - Add a simple animation to the 3D model
  13. Get the Asset (1)
     - Download a 3D model
     - The Sceneform tooling can import models in the OBJ, FBX and glTF formats
     - You can download free models from poly.google.com, turbosquid.com, cgtrader.com, free3d.com
  14. Importing the Asset (3)
     - Android Studio now has a folder type called sampledata
     - It holds design-time data that will not be packaged into our APK
     - To add the sample data folder, right-click app -> New -> Sample Data Directory
     - Drop the contents (Heart.mtl and Heart.obj) into the sampledata folder
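The .sfa/.sfb pair mentioned on the next slide is produced by the Sceneform Gradle plugin. A typical app-level build.gradle setup looks roughly like this (the version number is illustrative; check the current releases):

```groovy
// app/build.gradle — sketch; paths match the Heart asset used in this talk
apply plugin: 'com.google.ar.sceneform.plugin'

dependencies {
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.15.0'
}

// Converts the OBJ in sampledata into Heart.sfa (the editable description)
// and Heart.sfb (the binary that is bundled into the APK's assets)
sceneform.asset('sampledata/Heart.obj',
                'default',                 // material
                'sampledata/Heart.sfa',
                'src/main/assets/Heart')
```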
  15. Exporting the Asset (3)
     - Two files are generated: Heart.sfa in the sampledata folder and Heart.sfb in the assets folder
  16. Adjusting Scale (4)
     - Adjust the scale value in the *.sfa file to the required dimensions
     - If you are not sure of the size, you can find it by trial and error, rendering the scene without AR using SceneView
  17. Adjusting Scale (Heart.sfa) (4)
     file: "sampledata/Heart.obj",
     name: "Heart",
     scale: 0.0010,
     suggested_collision: {
       center: { x: 0, y: 0, z: 0 },
       size: { x: 1, y: 1, z: 1 },
     }
  18. Adding Permissions (5)
     <uses-permission android:name="android.permission.CAMERA" />
     <uses-feature android:name="android.hardware.camera.ar" android:required="true" />
     <application>
       ...
       <meta-data android:name="com.google.ar.core" android:value="required" />
     </application>
  19. Creating ARFragment (6)
     <android.support.constraint.ConstraintLayout>
       <!-- add the fragment below -->
       <fragment
         android:id="@+id/sceneform_fragment"
         android:name="com.google.ar.sceneform.ux.ArFragment"
         android:layout_width="match_parent"
         android:layout_height="match_parent"
         app:layout_constraintBottom_toBottomOf="parent"
         app:layout_constraintEnd_toEndOf="parent"
         app:layout_constraintStart_toStartOf="parent"
         app:layout_constraintTop_toTopOf="parent" />
     </android.support.constraint.ConstraintLayout>
  20. Creating ARFragment (6)
     class MainActivity : AppCompatActivity() {
         lateinit var fragment: ArFragment

         override fun onCreate(savedInstanceState: Bundle?) {
             super.onCreate(savedInstanceState)
             setContentView(R.layout.activity_main)
             setSupportActionBar(toolbar)
             fragment = supportFragmentManager
                 .findFragmentById(R.id.sceneform_fragment) as ArFragment
         }
     }
  21. Finding the Position (7)
     private fun addObject(parse: Uri) {
         val frame = fragment.arSceneView.arFrame
         val point = getScreenCenter()
         if (frame != null) {
             val hits = frame.hitTest(point.x.toFloat(), point.y.toFloat())
             for (hit in hits) {
                 val trackable = hit.trackable
                 if (trackable is Plane && trackable.isPoseInPolygon(hit.hitPose)) {
                     placeObject(fragment, hit.createAnchor(), parse)
                     break
                 }
             }
         }
     }
  23. Finding the Position (7)
     private fun getScreenCenter(): android.graphics.Point {
         val view = findViewById<View>(android.R.id.content)
         return Point(view.width / 2, view.height / 2)
     }
  26. Placing the Object (8)
     private fun placeObject(fragment: ArFragment, createAnchor: Anchor, model: Uri) {
         ModelRenderable.builder()
             .setSource(fragment.context, model)
             .build()
             .thenAccept { addNodeToScene(fragment, createAnchor, it) }
             .exceptionally {
                 // handle error
                 null
             }
     }
  27. Placing the Object (8)
     private fun addNodeToScene(fragment: ArFragment, createAnchor: Anchor, renderable: ModelRenderable) {
         val anchorNode = AnchorNode(createAnchor)
         val transformableNode = TransformableNode(fragment.transformationSystem)
         transformableNode.renderable = renderable
         transformableNode.setParent(anchorNode)
         fragment.arSceneView.scene.addChild(anchorNode)
         transformableNode.select()
     }
  28. Rotating the Object
     private fun addNodeToScene(fragment: ArFragment, createAnchor: Anchor, renderable: ModelRenderable) {
         val anchorNode = AnchorNode(createAnchor)
         val rotatingNode = RotatingNode()
         rotatingNode.renderable = renderable
         rotatingNode.setParent(anchorNode)
         fragment.arSceneView.scene.addChild(anchorNode)
     }
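The RotatingNode used above is not defined on the slides; a minimal sketch of such a node, using Sceneform's per-frame onUpdate callback (the class body and the degreesPerSecond parameter are my assumptions):

```kotlin
import com.google.ar.sceneform.FrameTime
import com.google.ar.sceneform.Node
import com.google.ar.sceneform.math.Quaternion
import com.google.ar.sceneform.math.Vector3

// Sketch: a Node that spins its renderable about the Y axis every frame.
class RotatingNode(private val degreesPerSecond: Float = 90f) : Node() {
    override fun onUpdate(frameTime: FrameTime) {
        // rotate by (time since last frame) * angular speed
        val deltaDegrees = frameTime.deltaSeconds * degreesPerSecond
        val spin = Quaternion.axisAngle(Vector3(0f, 1f, 0f), deltaDegrees)
        localRotation = Quaternion.multiply(localRotation, spin)
    }
}
```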
  29. Manual Handling of the Session
     Note: if you don't want the fragment to do this automatically, perhaps because your app needs to request additional permissions, you can use ArSceneView directly, as demonstrated in the Sceneform solarsystem sample app. You'll need to perform runtime checks, request permissions, create the AR session, and call setupSession() yourself.
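A rough outline of that manual path, with the runtime checks, permission requests and the ArCoreApk install flow omitted for brevity (so this sketch assumes they have already succeeded):

```kotlin
import android.app.Activity
import com.google.ar.core.Config
import com.google.ar.core.Session
import com.google.ar.sceneform.ArSceneView

// Sketch only: assumes CAMERA permission is granted and ARCore is installed.
fun setUpSessionManually(activity: Activity, arSceneView: ArSceneView) {
    val session = Session(activity)      // create the AR session yourself
    val config = Config(session)
    session.configure(config)
    arSceneView.setupSession(session)    // hand the session to the view
    arSceneView.resume()                 // call from onResume(); pause() in onPause()
}
```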
  30. AugmentedImage DB (3)
     private fun setupAugmentedImageDb(config: Config): Boolean {
         val augmentedImageDatabase = AugmentedImageDatabase(session)
         val augmentedImageBitmap = loadAugmentedImage() ?: return false
         augmentedImageDatabase.addImage("qrcode", augmentedImageBitmap)
         config.augmentedImageDatabase = augmentedImageDatabase
         return true
     }

     private fun loadAugmentedImage(): Bitmap? {
         try {
             assets.open("qrcode.png").use { `is` ->
                 return BitmapFactory.decodeStream(`is`)
             }
         } catch (e: IOException) {
             // ... handle the exception
         }
         return null
     }
  31. Configure Session (4)
     private fun configureSession() {
         val config = Config(session)
         if (!setupAugmentedImageDb(config)) {
             // show error to user
         }
         config.updateMode = Config.UpdateMode.LATEST_CAMERA_IMAGE
         session?.configure(config)
     }
  32. Update Frame (5)
     private fun onUpdateFrame() {
         val frame = arSceneView.arFrame ?: return
         val updatedAugmentedImages = frame.getUpdatedTrackables(AugmentedImage::class.java)
         for (augmentedImage in updatedAugmentedImages) {
             if (augmentedImage.trackingState == TrackingState.TRACKING &&
                 augmentedImage.name == "qrcode") {
                 val node = AugmentedImageNode(this, layouts[Random().nextInt(layouts.size)])
                 node.image = augmentedImage
                 arSceneView.scene.addChild(node)
             }
         }
     }
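Something has to invoke onUpdateFrame() once per frame; one common way (assuming an arSceneView field as in the snippet above) is to register a scene update listener:

```kotlin
// Register in onCreate(), after the ArSceneView exists.
arSceneView.scene.addOnUpdateListener { frameTime ->
    onUpdateFrame()
}
```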
  33. Augmented Images (Image credits: assets.materialup.com)
     - The image DB can store up to 1,000 reference images
     - Can track up to 20 images simultaneously
     - The physical image should be at least 15 cm x 15 cm
     - Cannot track moving images
     - Works without an internet connection
  34. Res: Text, Project (Image credits: assets.materialup.com)
     - Article: bit.ly/2KJUykW
     - Dev doc: bit.ly/2J6p0jy
     - Codelab: bit.ly/2m2bqoj
     - Github: bit.ly/2m6XEkd
     - Samples: bit.ly/2zmZGq3
  35. Res: Videos (Image credits: assets.materialup.com)
     - Intro: bit.ly/2KYrt4H
     - Sceneform: bit.ly/2m4MOeC
     - Design: bit.ly/2ufSy9e