
What's new in ARCore from IO19


@shibuya.apk

sasami

June 28, 2019


Transcript

  1. Sessions in I/O’19 • What’s New in ARCore • Designing

    AR Applications • Increasing AR Realism with Lighting • Developing the First AR Experience for Google Maps • Augmenting Faces and Images • AR as a Feature: How to Supercharge Products Using Augmented Reality
  2. Outline for today's presentation • What is ARCore? • What's

    new in ARCore? • How can we develop using ARCore?
  3. What is ARCore? • Software development kit for building an

    AR application ◦ Motion tracking ◦ Environmental understanding ◦ Light estimation • Platform ◦ Android Studio ← today’s talk ◦ iOS (with ARKit) ◦ Unity ◦ Unreal
  4. Sceneform • Render realistic 3D scenes without having to learn

    OpenGL • Develop in Android Studio ◦ SDK ◦ Plugins for Android Studio • Prerequisites ◦ Android Studio 3.1 or higher ◦ Android SDK Platform version 7.0 (API level 24) or higher
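Getting those prerequisites into a project usually comes down to a few Gradle entries. A minimal sketch, assuming the mid-2019 artifact coordinates; the version numbers are examples and may be outdated:

```groovy
// app/build.gradle (illustrative; check the current Sceneform docs for versions)
android {
    compileOptions {
        // Sceneform requires Java 8 language features
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}

dependencies {
    implementation 'com.google.ar:core:1.9.0'
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.9.0'
}
```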
  5. What’s new in ARCore? • Environmental HDR API • Updates

    in Augmented Faces and Images • Model Viewer • AR navigation in Google Maps
  6. Environmental HDR API • Increasing AR Realism with Lighting (Google

    I/O’19) • Shows AR objects with accurate shadows, highlights and reflections ◦ Estimated using machine learning from a single camera frame Before and after Environmental HDR is applied to the digital mannequin on the left, featuring 3D printed designs from Julia Koerner (https://developers.googleblog.com/2019/05/ARCore-IO19.html)
  7. Main Directional Light • Detects the light direction ◦ Shows

    reflections coming from the actual direction of the light, rather than assuming light from straight down
  8. Ambient Spherical Harmonics • Gives the object a sense of

    the overall intensity of light from all directions ◦ Adds more realistic shading to the object
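To make the ambient term concrete: it is commonly stored as 9 spherical-harmonics coefficients per color channel, and shading a surface means evaluating the order-2 SH basis in the normal's direction. A minimal, self-contained sketch of that evaluation (illustrative math only; the class and method names here are not part of the ARCore API):

```java
public class ShDemo {
    // Evaluate 9 order-2 spherical-harmonics coefficients for one color
    // channel in direction (x, y, z); the direction must be normalized.
    public static float evalSh(float[] c, float x, float y, float z) {
        return c[0] * 0.282095f                      // Y(0, 0)  constant term
             + c[1] * 0.488603f * y                  // Y(1,-1)
             + c[2] * 0.488603f * z                  // Y(1, 0)
             + c[3] * 0.488603f * x                  // Y(1, 1)
             + c[4] * 1.092548f * x * y              // Y(2,-2)
             + c[5] * 1.092548f * y * z              // Y(2,-1)
             + c[6] * 0.315392f * (3f * z * z - 1f)  // Y(2, 0)
             + c[7] * 1.092548f * x * z              // Y(2, 1)
             + c[8] * 0.546274f * (x * x - y * y);   // Y(2, 2)
    }

    public static void main(String[] args) {
        // With only the constant coefficient set, the result is the
        // same in every direction: a pure ambient term.
        float[] c = new float[9];
        c[0] = 1f;
        System.out.println(evalSh(c, 0f, 0f, 1f)); // ~0.282095
    }
}
```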
  9. HDR Cubemap • High dynamic range cubemap ◦ Full specular highlights

    and reflections ※cubemap... a way of mapping the texture of the surrounding environment
  10. How to develop? • Using Sceneform ※This API has not

    been published yet (expected late summer this year)
  11. Directional Light API

    Config config = session.getConfig();
    config.setLightEstimationMode(Config.LightEstimationMode.ENVIRONMENTAL_HDR);
    session.configure(config);

    // Get light estimation from frame
    LightEstimate lightEstimate = frame.getLightEstimate();
    if (lightEstimate.getState() != LightEstimate.State.VALID) {
      return;
    }

    // Get main light intensity & direction
    float[] direction = lightEstimate.getMainLightDirection();
    float[] intensity = lightEstimate.getMainLightIntensity();
  12. Ambient Spherical Harmonics API

    Config config = session.getConfig();
    config.setLightEstimationMode(Config.LightEstimationMode.ENVIRONMENTAL_HDR);
    session.configure(config);

    // Get light estimation from frame
    LightEstimate lightEstimate = frame.getLightEstimate();
    if (lightEstimate.getState() != LightEstimate.State.VALID) {
      return;
    }
    float[] harmonics = lightEstimate.getAmbientSphericalHarmonics();
  13. HDR Cubemap

    Config config = session.getConfig();
    config.setLightEstimationMode(Config.LightEstimationMode.ENVIRONMENTAL_HDR);
    session.configure(config);

    // Get light estimation from frame
    LightEstimate lightEstimate = frame.getLightEstimate();
    if (lightEstimate.getState() != LightEstimate.State.VALID) {
      return;
    }
    Image[] lightmaps = lightEstimate.getHdrCubeMap();
    for (int i = 0; i < lightmaps.length; ++i) {
      app.uploadToTexture(i, lightmaps[i]);
    }
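The cubemap and intensity values are linear HDR radiance, so they can exceed [0, 1] and need tone mapping before display. A minimal Reinhard tone-mapping sketch (illustrative only; this helper is not part of the ARCore or Sceneform API):

```java
public class ToneMapDemo {
    // Simple per-channel Reinhard operator: maps [0, inf) into [0, 1).
    public static float reinhard(float hdr) {
        return hdr / (1f + hdr);
    }

    // Tone-map a linear HDR RGB triple to displayable LDR values.
    public static float[] toneMap(float[] rgb) {
        float[] out = new float[3];
        for (int i = 0; i < 3; i++) {
            out[i] = reinhard(rgb[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        float[] ldr = toneMap(new float[] {1f, 3f, 0f});
        System.out.println(ldr[0] + " " + ldr[1] + " " + ldr[2]); // 0.5 0.75 0.0
    }
}
```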
  14. Augmented Images • Point the camera at 2D images and render 3D models

    anchored to those images • Multiple images can be tracked at the same time An example of how the Augmented Images API can be used with moving targets by JD.com (https://developers.googleblog.com/2019/05/ARCore-IO19.html)
  15. Augmented Images in Google I/O • Demo at Sandbox ◦

    Showed instructions for a coffee machine • AR navigation (from the Google I/O 2019 app) ◦ Shows 3D signposts with the name of each spot/building
  16. How does it work? • Recognize/track images • Provide a 6DoF pose

    relative to the device camera • Anchor content to the image
  17. Recognize/track images • Register images as augmented images inside the

    app • Up to 20 images trackable simultaneously • Up to 1,000 images can be stored in the augmented image database • All tracking happens on the device
  18. Provide a 6DoF pose with the device camera • Six degrees of

    freedom pose with respect to the image ◦ x, y, z and orientation (pitch, yaw and roll) → this enables showing 3D content at the right position (https://ja.wikipedia.org/wiki/6DoF)
  19. Anchor content to the image • Overlay content at the right

    position ex) signposts at Google I/O How can 3D content be shown at each landmark's position? landmark pose w.r.t. signpost + device pose w.r.t. signpost = overlay content at the right position
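The "landmark pose w.r.t. signpost + device pose w.r.t. signpost" arithmetic above is ordinary rigid-pose composition: rotate the child translation by the parent rotation, add the parent translation, and multiply the quaternions. A self-contained sketch of that math (the array layout and helper names are illustrative, not ARCore's `Pose.compose`):

```java
public class PoseDemo {
    // A pose is stored here as {tx, ty, tz, qx, qy, qz, qw}: translation
    // followed by a unit quaternion with w last.

    static float[] cross(float[] a, float[] b) {
        return new float[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0],
        };
    }

    // Rotate vector v by unit quaternion q: v' = v + 2w(qv x v) + 2 qv x (qv x v).
    static float[] rotate(float[] q, float[] v) {
        float[] qv = {q[0], q[1], q[2]};
        float[] t = cross(qv, v);
        float[] u = cross(qv, t);
        return new float[] {
            v[0] + 2f * (q[3] * t[0] + u[0]),
            v[1] + 2f * (q[3] * t[1] + u[1]),
            v[2] + 2f * (q[3] * t[2] + u[2]),
        };
    }

    // Compose pose a with pose b (express b, given in a's frame, in the world frame).
    public static float[] compose(float[] a, float[] b) {
        float[] qa = {a[3], a[4], a[5], a[6]};
        float[] qb = {b[3], b[4], b[5], b[6]};
        float[] t = rotate(qa, new float[] {b[0], b[1], b[2]});
        // Hamilton product qa * qb.
        float[] q = {
            qa[3] * qb[0] + qb[3] * qa[0] + qa[1] * qb[2] - qa[2] * qb[1],
            qa[3] * qb[1] + qb[3] * qa[1] + qa[2] * qb[0] - qa[0] * qb[2],
            qa[3] * qb[2] + qb[3] * qa[2] + qa[0] * qb[1] - qa[1] * qb[0],
            qa[3] * qb[3] - qa[0] * qb[0] - qa[1] * qb[1] - qa[2] * qb[2],
        };
        return new float[] {a[0] + t[0], a[1] + t[1], a[2] + t[2], q[0], q[1], q[2], q[3]};
    }

    public static void main(String[] args) {
        // Parent pose: 90-degree rotation about z, no translation.
        float s = (float) Math.sqrt(0.5);
        float[] parent = {0f, 0f, 0f, 0f, 0f, s, s};
        // Child pose: one unit along x in the parent's frame.
        float[] child = {1f, 0f, 0f, 0f, 0f, 0f, 1f};
        float[] out = compose(parent, child);
        // The child's x offset lands along the parent's rotated x axis, i.e. +y.
        System.out.println(out[0] + " " + out[1]); // ~0.0 ~1.0
    }
}
```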
  20. Steps to develop • Create Augmented Images D/B • Create

    & Configure ARCore session • Query pose, Render content
  21. Create Augmented Images D/B • Augmented Images database ◦ can

    store up to 1,000 images ◦ an offline tool can be used to generate this database ◦ images can also be added at runtime → images added to the database act as triggers for showing 3D content
  22. Create & Configure ARCore session

    private void configureSession() {
      Config config = new Config(session);
      augImgDatabase = AugmentedImageDatabase.deserialize(session, getAssets().open("sample_database.imgdb"));
      config.setAugmentedImageDatabase(augImgDatabase);
      session.configure(config);
    }
  23. Query pose, Render content

    public void onFrameUpdate() {
      Collection<AugmentedImage> updatedImages = frame.getUpdatedTrackables(AugmentedImage.class);
      for (AugmentedImage image : updatedImages) {
        if (image.getTrackingState() == TrackingState.TRACKING) {
          drawAugmentedImages(image.getIndex(), image.getCenterPose());
          ...
  24. Center Pose • a center pose for a face (x, y, z) which

    marks the middle of the head ◦ useful when you render, for example, a hat
  25. 468-point 3D face mesh • dense 3D mesh ◦

    useful to paint detailed textures that accurately follow a face ◦ works without specialized hardware (depth sensor) by using machine learning
  26. App using Augmented Faces • MakeupPlus ◦ Face effects based

    on ARCore https://play.google.com/store/apps/details?id=com.meitu.makeup&hl=en
  27. Augmented Faces on iOS • one asset runs across both

    iOS and Android ◦ APIs for iOS will be available later this summer
  28. How does it work? • machine learning models built on top

    of the TensorFlow Lite platform ◦ transfer learning ▪ a way of training on a dataset starting from a pre-trained neural network ▪ predicts both 3D vertices & 2D contours • runs on device in real time
  29. Create virtual content • 3D: an FBX template file is provided

    in the SDK ◦ developer-guides/creating-assets-for-augmented-faces • 2D: a template texture file is provided as well ◦ a flattened version of the face mesh ◦ `reference_face_texture.png`
  30. Configure augmented face mode

    public class FaceArFragment extends ArFragment {
      @Override
      protected Set<Session.Feature> getSessionFeatures() {
        return EnumSet.of(Session.Feature.FRONT_CAMERA);
      }

      @Override
      protected Config getSessionConfiguration(Session session) {
        Config config = new Config(session);
        config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
        return config;
      }
    }
  31. Configure augmented face mode (3D content)

    protected void onCreate(Bundle savedInstanceState) {
      ...
      // Load the face regions renderable.
      // Load 3D content into the Sceneform scene.
      ModelRenderable.builder()
          .setSource(this, R.raw.fox_faces)
          .build()
          .thenAccept(modelRenderable -> { faceRegionsRenderable = modelRenderable; ... });
  32. Configure augmented face mode (2D)

    protected void onCreate(Bundle savedInstanceState) {
      ...
      // Load the face mesh texture.
      // 2D texture you designed.
      Texture.builder()
          .setSource(this, R.drawable.flag_texture)
          .build()
          .thenAccept(texture -> faceMeshTexture = texture);
  33. Import virtual content

    OnUpdatedListener onUpdateListener() {
      ...
      for (AugmentedFace face : faceList) {
        // Create a face node and add it to the scene.
        AugmentedFaceNode faceNode = new AugmentedFaceNode(face);
        faceNode.setParent(scene);
        // Overlay the 3D assets on the face.
        faceNode.setFaceRegionsRenderable(faceRegionsRenderable);
      }
    }
  34. Codelab <model-viewer src="./my/model.glb" ar> </model-viewer> • Load a glTF/.glb file with

    the <model-viewer> tag ◦ glTF is an asset format for WebGL/OpenGL ◦ it can be loaded at runtime
  35. Codelab • Add 3D Models to The Web with <model-viewer>

    ◦ https://awake-pigeon.glitch.me/
  36. AR navigation in Google Maps • Developing the First AR

    Experience for Google Maps ◦ How to get enough accuracy for AR navigation in Google Maps? ▪ Speed of Light ◦ You can try it on Pixel smartphones!