Getting Started with ARCore 1.7

A tutorial on the basic usage of ARCore 1.7

TakashiYoshinaga

March 08, 2019

Transcript

  1. Getting Started with ARCore 1.7 #AR_Fukuoka

  2. Name: Takashi Yoshinaga Affiliation: Institute of Systems, Information Technologies and Nanotechnologies

    Field of work: Application of AR for medicine and education SNS: Twitter -> @Tks_Yoshinaga LinkedIn -> tks-yoshinaga
  3. ARCore 【Features】 (1) Motion Tracking to realize markerless AR.

    (2) Light Estimation, which estimates light intensity and color. (3) Environmental Understanding to recognize floors and walls. (4) Augmented Image to recognize predefined image markers. (5) Cloud Anchor for sharing an AR experience among multiple users. (6) Augmented Faces for face recognition and pose estimation.
  4. Today’s Tutorial 【Features】 (1) Motion Tracking to realize markerless

    AR. (2) Light Estimation, which estimates light intensity and color. (3) Environmental Understanding to recognize floors and walls. (4) Augmented Image to recognize predefined image markers. (5) Cloud Anchor for sharing an AR experience among multiple users. (6) Augmented Faces for face recognition and pose estimation.
  5. See also the deck below if you want to learn about Augmented Image

    and Cloud Anchor. https://speakerdeck.com/takashiyoshinaga/lets-start-ar-with-arcore-and-unity
  6. Today’s Tutorial ① Superimposing CG ② Plane Detection and Placing

    CG ③ 3D Drawing with Motion Tracking ④ How to Use Augmented Faces
  7. Preparation • Unity 2017.4.15 or later • ARCore SDK 1.7 https://github.com/google-ar/arcore-unity-sdk/releases/tag/v1.7.0

    • Sample http://arfukuoka.lolipop.jp/arcore_panasonic/sample.zip
  8. Creating Unity Project (1/2) Click New button after starting Unity

    New
  9. Creating Unity Project (2/2) Click Create Project button after inputting

    project name. 3D Project Name Save Directory Create Project
  10. Importing ARCore SDK to Unity ①Assets ②Import Package → Custom

    Package ③arcore-unity-sdk-xxx ④Open
  11. Tutorial 1: Motion Tracking

  12. Setting up the Camera for ARCore (1/2) Delete Main Camera

  13. Setting up the Camera for ARCore (2/2) ①GoogleARCore → Prefabs

    ② ARCore Device ③Drag & Drop
  14. Putting a Virtual Object (1/2) ①GoogleARCore → Examples → Common

    → Prefabs ② Andy Green Diffuse ③Drag & Drop
  15. Putting a Virtual Object (2/2) ①AndyGreenDiffuse ※The origin of the CG

    world is the position of the smartphone when the application was launched. ②Put Andy in front of the user: Position (x,y,z)=(0,0,0.5)[m]
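
    ※If you prefer to set this start position from a script instead of the Inspector, a minimal sketch could look like the following (PlaceAndyAtStart and its andy field are hypothetical names; andy is assumed to reference AndyGreenDiffuse):

    using UnityEngine;

    public class PlaceAndyAtStart : MonoBehaviour {
        public GameObject andy; // assumed to reference AndyGreenDiffuse

        void Start () {
            // The world origin is the smartphone's pose at launch,
            // so (0, 0, 0.5) places Andy 0.5 m in front of the user.
            andy.transform.position = new Vector3(0f, 0f, 0.5f);
        }
    }
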
  16. Save the Project ①File ②Save Scene as...

  17. Save the Project ①New Folder ③Open the Sample1 folder and save

    as sample1 ④Save ※After this operation the scene can be saved with Ctrl + S ②Name the new folder Sample1
  18. Let’s install app to smartphone

  19. Build Setting ①File ② Build Settings

  20. Build Setting ②Switch Platform ① Android

  21. Build Setting ②Player Settings ① Internal

  22. Build Setting ①Input Product Name ② Other Settings ③Disable Multithreaded

    Rendering
  23. Build Setting ①Input Package Name (e.g. com.yourname.test1) ② Set Minimum

    API Level to Android 7.0
  24. Build Setting ①XR Settings ② Enable ARCore Supported

  25. Build and Install ①File ② Build & Run

  26. Build and Install ①Input a name of apk ② Save

  27. Run & Confirm

  28. Tutorial 2: Environmental Understanding

  29. Environmental Understanding (1/5) ①Sample1

  30. Environmental Understanding (2/5) ①Right Click ②Create ③GoogleARCore ④SessionConfig

  31. Environmental Understanding (3/5) Change the file name (ex. sample1)

  32. Environmental Understanding (4/5) ①Click sample1.asset 【Plane Finding Mode】 (1)Disabled (2)Horizontal

    And Vertical (3)Horizontal (4)Vertical Choose a mode from (2)–(4)
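
    ※The same choice can also be made from a script. The sketch below assumes the SDK's ARCoreSessionConfig asset exposes a PlaneFindingMode field of type DetectedPlaneFindingMode, as in the ARCore Unity SDK sources; PlaneModeSetter is a hypothetical helper.

    using GoogleARCore;
    using UnityEngine;

    public class PlaneModeSetter : MonoBehaviour {
        public ARCoreSessionConfig sessionConfig; // assign sample1.asset here

        void Start () {
            // Detect horizontal planes only; the other options are
            // Disabled, Vertical, and HorizontalAndVertical.
            sessionConfig.PlaneFindingMode = DetectedPlaneFindingMode.Horizontal;
        }
    }
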
  33. Environmental Understanding (5/5) ②Click ARCore Device ①Sample1 ③sample1.asset ④Drag &

    Drop to Session Config
  34. Visualization of Detected Planes (1/4) ①Right Click ② Create Empty

  35. Visualization of Detected Planes (2/4) Change the name of GameObject

    to Controller
  36. Visualization of Detected Planes (3/4) ①Click Controller ②AddComponent ③Detected Plane

    Generator
  37. Visualization of Detected Planes (4/4) ①GoogleARCore → Examples → Common

    → Prefabs ③DetectedPlaneVisualizer ②Controller ④Drag & Drop to Detected Plane Prefab
  38. Run & Confirmation Horizontal Plane Vertical Plane

  39. Putting Object to the Tapped Place ①Controller ②Add Component

  40. Putting Object to the Tapped Place ①New Script ③Create and

    Add ②PutScript
  41. Putting Object to the Tapped Place ①Controller ②Double Click PutScript

  42. Script

    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using GoogleARCore;

    public class PutScript : MonoBehaviour {
        public GameObject andy; // Variable to handle the CG (Andy)

        void Start () { }

        void Update () {
            // (1) Detect tap.
            // (2) Transform the 2D position to a 3D position in the real world.
            // (3) Put Andy there.
        }
    }
  43. Script

    void Update () {
        // Return if a screen touch is not detected.
        if (Input.touchCount < 1) { return; }
        Touch touch = Input.GetTouch(0);

        // Return if the touch state is not a swipe.
        if (touch.phase != TouchPhase.Moved) { return; }

        // Calculate the touched position on a detected plane.
        TrackableHit hit;
        TrackableHitFlags filter = TrackableHitFlags.PlaneWithinPolygon;
        if (Frame.Raycast(touch.position.x, touch.position.y, filter, out hit)) {
            // Move Andy to the pointed position. (Next page)
        }
    }
  44. Script

    if (Frame.Raycast(touch.position.x, touch.position.y, filter, out hit)) {
        // If the pointed trackable is a plane
        if (hit.Trackable is DetectedPlane) {
            // Set the position and the angle of Andy.
            andy.transform.position = hit.Pose.position;
            andy.transform.rotation = hit.Pose.rotation;
            andy.transform.Rotate(0, 180, 0, Space.Self);

            // Set an anchor to fix the Andy object to real space.
            var anchor = hit.Trackable.CreateAnchor(hit.Pose);
            andy.transform.parent = anchor.transform;
        }
    }
  45. Coupling Virtual Object and Variable ②AndyGreenDiffuse ①Controller ③ Drag &

    Drop to Put Script’s Andy
  46. Run & Confirm You can place Andy on the touched

    position of a detected plane.
  47. Save this project by Ctrl+S

  48. Tutorial 3: 3D Drawing with Motion Tracking

  49. Preparation for Next Tutorial (1/6) Select Sample1 & Ctrl +D

  50. Preparation for Next Tutorial (2/6) ②Rename each file to sample2

    ①sample2
  51. Preparation for Next Tutorial (3/6) ①Double click sample2.unity ②sample2 will

    appear
  52. Preparation for Next Tutorial (4/6) ①ARCoreDevice ②sample2.asset ③Drag & Drop

    into Session Config
  53. Preparation for Next Tutorial (5/6) Delete AndyGreenDiffuse

  54. Preparation for Next Tutorial (6/6) ①Controller ②Click the gear icon of PutScript ③Remove

    Component
  55. Run & Confirmation Horizontal Plane Vertical Plane

  56. Making Material of Line ①Right Click ②Create ③Material

  57. Making Material of Line ①New Material ②Click the white color box and

    choose a color
  58. Drawing 3D Line with TrailRenderer ①Right Click ②Create Empty

  59. Drawing 3D Line with TrailRenderer ①GameObject ②Add Component

  60. Drawing 3D Line with TrailRenderer Search “trail” 【Trail Renderer】 ▪Material

    Drag & Drop NewMaterial into Element0 ▪Time Change to Infinity ▪MinVertexDistance 0.03 ▪Width 0.01 Double click Trail Renderer
  61. Drawing 3D Line with TrailRenderer Change Shader to Sprites/Default ①GameObject
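
    ※The Trail Renderer values above can also be applied from a script. A minimal sketch, assuming the material created earlier (with the Sprites/Default shader) is assigned to the lineMaterial field; TrailSetup is a hypothetical helper.

    using UnityEngine;

    public class TrailSetup : MonoBehaviour {
        public Material lineMaterial; // NewMaterial using the Sprites/Default shader

        void Start () {
            TrailRenderer trail = gameObject.AddComponent<TrailRenderer>();
            trail.material = lineMaterial;    // Element0 of Materials
            trail.time = Mathf.Infinity;      // keep the trail forever
            trail.minVertexDistance = 0.03f;  // add a vertex every 3 cm of movement
            trail.startWidth = 0.01f;         // line width 0.01 m
            trail.endWidth = 0.01f;
        }
    }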

  62. Run & Confirm ①Scene Line appears while dragging & moving

    GameObject ②GameObject
  63. Enable Drawing Multi-line ②GameObject ①Sample2 ③Drag & Drop

  64. Enable Drawing Multi-line Delete GameObject

  65. Enable Drawing Multi-line ①Controller ②AddComponent

  66. Enable Drawing Multi-line ②New Script ④Create and Add ③DrawScript ①Clear

    form
  67. Enable Drawing Multi-line ①Controller ②Double Click DrawScript

  68. Script

    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using GoogleARCore;

    public class DrawScript : MonoBehaviour {
        public GameObject obj;  // Template of the trail-drawing object
        GameObject drawObj;     // Object used for the actual trail drawing

        void Start () { }

        void Update () {
            // Detect tap.
            // Instantiate the drawing object when the screen touch starts.
            // Draw the trail line while the user is touching the screen.
        }
    }
  69. Script

    void Update () {
        if (Input.touchCount == 1) {
            // Calculate the position (0, 0, 0.1) [m] relative to the camera.
            Vector3 p = Camera.main.transform.TransformPoint(0, 0, 0.1f);

            // Touch has started.
            if (Input.GetTouch(0).phase == TouchPhase.Began) {
                drawObj = GameObject.Instantiate(obj, p, Quaternion.identity);
            }
            // While touching.
            else if (Input.GetTouch(0).phase == TouchPhase.Stationary) {
                drawObj.transform.position = p;
            }
        }
    }
  70. Coupling Virtual Object and Variable ②Controller ①Sample2 ④Drag & Drop

    into Obj ③GameObject
  71. Run & Confirmation

  72. Enable Deleting Line

    List<GameObject> lines = new List<GameObject>();

    void Update () {
        if (Input.touchCount == 1) {
            // Get the position 10 cm in front of the camera.
            Vector3 p = Camera.main.transform.TransformPoint(0, 0, 0.1f);

            // When the touch starts, create a new line object and remember it.
            if (Input.GetTouch(0).phase == TouchPhase.Began) {
                GameObject tmp = GameObject.Instantiate(obj, p, Quaternion.identity);
                lines.Add(tmp);
                drawObj = tmp;
            }
            // While touching.
            else if (Input.GetTouch(0).phase == TouchPhase.Stationary) {
                drawObj.transform.position = p;
            }
        }
    }
  73. Enable Deleting Line

    void Update () {
        if (Input.touchCount == 1) {
            // Code for drawing trail lines (previous page).
        }
        else if (Input.touchCount == 2) {
            if (Input.GetTouch(0).phase == TouchPhase.Ended) {
                // Destroy every drawn line and clear the list.
                for (int i = 0; i < lines.Count; i++) {
                    Destroy(lines[i]);
                    lines[i] = null;
                }
                lines.Clear();
            }
        }
    }
  74. Lines will be deleted when you tap the screen of the

    smartphone with two fingers!
  75. Save the project by Ctrl+S

  76. Tutorial 4: Augmented Faces

  77. Preparation for Next Tutorial (1/6) Select Sample2 & Ctrl +D

  78. Preparation for Next Tutorial (2/6) ②Change each name to sample3

    ①sample3
  79. Preparation for Next Tutorial (3/6) ①Double click sample3.unity ②sample3 will

    appear
  80. Preparation for Next Tutorial (4/6) ①ARCoreDevice ②sample3.asset ③Drag & Drop

    to Session Config
  81. Preparation for Next Tutorial (5/6) Delete GameObject

  82. Preparation for Next Tutorial (6/6) Delete Controller

  83. Enable Front Camera ①ARCoreDevice ②Change Device Camera Direction to Front

    Facing
  84. Enable Face Detection ②sample3.asset ①sample3 ③Augmented Face Mode

  85. Attach a Mesh to Face ②Create Empty ①Right Click

  86. Attach a Mesh to Face Change the name of GameObject

    to FaceMesh
  87. Attach a Mesh to Face ①FaceMesh ②Add Component

  88. Attach a Mesh to Face ①Search Face ②ARCore Augmented Face

    Mesh Filter
  89. Attach a Mesh to Face Check Auto Bind ※Enables automatic

    attachment of the mesh to the face
  90. Adding Mesh Renderer ①FaceMesh ②Add Component

  91. Adding Mesh Renderer ②Mesh ③MeshRenderer ①Clear form

  92. Setting Up Material of Face Mesh ①Sample3 ③Create ②Right Click

    ④Material
  93. Setting Up Material of Face Mesh ①NewMaterial ②Shader

  94. Augmented Faces ②NewMaterial ①FaceMesh ③Drag & Drop into Element0 of Materials

  95. Run & Confirm

  96. Applying a Texture Image to Material ②Texture ※You can use

    an image with an alpha channel by choosing Transparent ①Unlit
  97. Applying a Texture Image to Material ③Drag&Drop ①Sample Folder ①Sample3

  98. Applying a Texture Image to Material ①NewMaterial

  99. Applying a Texture Image to Material ②Drag&Drop ①Mesh

  100. Run & Confirm You can confirm the positional relationship between the face

    & the texture image.
  101. Using Position of Face Parts FOREHEAD_LEFT FOREHEAD_RIGHT NOSE_TIP 3 parts,

    named FOREHEAD_LEFT/RIGHT & NOSE_TIP, are available in the SDK.
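
    ※The region poses can also be read from a script. The sketch below assumes the SDK 1.7 AugmentedFace trackable, its GetRegionPose method, and the AugmentedFaceRegion enum (NoseTip, ForeheadLeft, ForeheadRight), which the ARCoreAugmentedFaceRig example is built on; FacePartsLogger is a hypothetical helper.

    using System.Collections.Generic;
    using GoogleARCore;
    using UnityEngine;

    public class FacePartsLogger : MonoBehaviour {
        private List<AugmentedFace> _faces = new List<AugmentedFace>();

        void Update () {
            // Collect the faces currently tracked in this session.
            Session.GetTrackables<AugmentedFace>(_faces, TrackableQueryFilter.All);
            foreach (AugmentedFace face in _faces) {
                if (face.TrackingState != TrackingState.Tracking) { continue; }

                // World-space pose of the nose tip; the forehead regions work the same way.
                Pose nose = face.GetRegionPose(AugmentedFaceRegion.NoseTip);
                Debug.Log("Nose tip position: " + nose.position);
            }
        }
    }
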
  102. Using Position of Face Parts ②CreateEmpty ①Right Click

  103. Using Position of Face Parts ①Change name of GameObject to

    FaceParts
  104. Using Position of Face Parts ①FaceParts ②Add Component

  105. Using Position of Face Parts ①Search “Face” ②ARCore Augmented Face

    Rig
  106. Using Position of Face Parts ①Auto Bind

  107. Augmented Faces ①Open FaceParts ②You can see NOSE_TIP, FOREHEAD_LEFT/RIGHT

  108. Using Position of Face Parts ①Right click NOSE_TIP ②3D Object

    ③Sphere
  109. Using Position of Face Parts ①Sphere ②Edit Transform Position: all

    0 Scale: all 0.04
  110. Using Position of Face Parts Add a sphere as a child of

    FOREHEAD_RIGHT and FOREHEAD_LEFT by the same procedure
  111. Using Position of Face Parts

  112. Modification of Sphere’s Position ① Select the Sphere that is a child of

    FOREHEAD_LEFT ② Set Position Y = -0.04
  113. Modification of Sphere’s Position ① Select the Sphere that is a child of

    FOREHEAD_RIGHT ② Set Position Y = -0.04
  114. Complete