Measuring a room with ARKit: experiences and challenges

Adrian Tineo
February 21, 2018

In October 2017, Terence, Jan and Adrian took three days to tackle the challenge of measuring a room with the ARKit technology available in iOS 11 on capable devices. Here is what they learned.

As presented at the App Builders Zurich meetup on 21st Feb 2018: https://www.meetup.com/App-Builders-Zurich/events/246680649


Transcript

  1. Why
     • Solve a real user problem
     • Continuous education of ourselves
     • Learn a new paradigm
  2. When & Where
     • 3 days (17th-19th October 2017)
     • Isolated from the rest of the world
     • Coworking space at the Effingerstrasse
  3. ARKit
     • Augmented reality - Apple technology
     • Put virtual objects in real space
     • Use the phone's camera and sensors to recognise the surroundings
  4. ARKit
     • Works on all modern devices running iOS 11 (from the iPhone 6s and from the iPad Pros)
     • Already available on millions of devices
     • Quite precise in measuring distances (see the sketch below)
     • Can recognise horizontal planes or individual feature points
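
For context on the distances claim: ARKit's world coordinates are expressed in metres, so once two points in world space are known (for example, from hit-testing two user taps against a detected plane), the measurement is a plain Euclidean distance. A minimal sketch, not from the talk; the function name is illustrative:

    import ARKit
    import simd

    // Illustrative helper: distance in metres between two hit-test results,
    // e.g. two user taps projected onto a detected horizontal plane.
    // The translation of each hit sits in the last column of its 4x4 world transform.
    func distanceInMetres(from first: ARHitTestResult, to second: ARHitTestResult) -> Float {
        let a = first.worldTransform.columns.3
        let b = second.worldTransform.columns.3
        return simd_distance(simd_float3(a.x, a.y, a.z), simd_float3(b.x, b.y, b.z))
    }

Hit-test results of type .existingPlaneUsingExtent (via hitTest(_:types:) on ARSCNView, available since iOS 11) are one way to obtain such world transforms.
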
  5. Use case
     • A person is going to sell his house
     • He needs the floor plan for the house he's selling
     • He needs the measurements of his house
  6. Technical discussion
     • Main data structures in the SDK
     • Working with planes
     • Mathematical model
  7. ARKit
     Ref: https://developer.apple.com/documentation/arkit
     "ARKit combines device motion tracking, camera scene capture, advanced scene processing, and display conveniences to simplify the task of building an AR experience."
  8. Content technologies
     • 3 possible content technologies (views):
       • SceneKit, for basic use cases
       • SpriteKit, for 2D images
       • Metal, for custom renderers (Unity/Unreal)
  9. Scene and session

     class ViewController: UIViewController, ARSCNViewDelegate {
         @IBOutlet var sceneView: ARSCNView!

         override func viewDidLoad() {
             super.viewDidLoad()
             // Receive callbacks when anchors (e.g. detected planes) are added.
             sceneView.delegate = self
             sceneView.session = ARSession()
             sceneView.scene = SCNScene()
         }

         override func viewWillAppear(_ animated: Bool) {
             super.viewWillAppear(animated)
             // World tracking with horizontal plane detection enabled.
             let configuration = ARWorldTrackingConfiguration()
             configuration.planeDetection = .horizontal
             sceneView.session.run(configuration)
         }

         // ...
     }
  10. SCNNode and ARAnchor
      • SCNNode is a structural element in the hierarchical graph of our SCNScene
      • Akin to UIView in UIKit
        func addChildNode(_ child: SCNNode)
      • ARAnchor is a real-world position and orientation where we can place virtual objects in the scene
      • Rendering SCNNodes over ARAnchors creates the AR experience (see the sketch below)
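
A minimal sketch of that idea, not from the talk: a small sphere node that could mark a measured point; attaching it to the node SceneKit creates for an anchor keeps it fixed at that real-world position. The helper name is illustrative:

    import ARKit
    import SceneKit
    import UIKit

    // Illustrative helper: a 5 mm red sphere that could mark a measured corner.
    func makeMarkerNode() -> SCNNode {
        let sphere = SCNSphere(radius: 0.005)
        sphere.firstMaterial?.diffuse.contents = UIColor.red
        return SCNNode(geometry: sphere)
    }

    // Usage (hypothetical), e.g. inside an ARSCNViewDelegate callback that hands
    // us the SCNNode created for an ARAnchor:
    // node.addChildNode(makeMarkerNode())
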
  11. Plane detection: basic process
      1. ARKit detects feature points in the world through the camera of the device
      2. If plane detection is enabled, the feature points are analyzed for planes
      3. If a plane is recognized, an ARPlaneAnchor is reported in the delegate method:
         renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor)
      4. We can place objects in the scene and they will appear as if they were on the plane defined in the real world (see the sketch below)
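
A minimal sketch of step 3, not from the talk, reusing the ViewController from slide 9: the newly detected plane is visualised by adding a translucent SCNPlane sized to the anchor's estimated extent. Geometry and colour choices are illustrative:

    import ARKit
    import SceneKit
    import UIKit

    extension ViewController {
        // Called by ARKit (via ARSCNViewDelegate) when a new anchor gets a node.
        func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
            guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

            // Build a plane geometry matching the anchor's estimated extent.
            let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                                 height: CGFloat(planeAnchor.extent.z))
            plane.firstMaterial?.diffuse.contents = UIColor.blue.withAlphaComponent(0.3)

            let planeNode = SCNNode(geometry: plane)
            // Centre it on the anchor and lay it flat (SCNPlane is vertical by default).
            planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
            planeNode.eulerAngles.x = -Float.pi / 2

            // Attaching it to `node` keeps the visual aligned with the real-world plane.
            node.addChildNode(planeNode)
        }
    }
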
  12. Plane detection: practical considerations
      • Plane detection is slow and depends on environmental factors.
      • The lag to detect the first plane poses a big challenge for the UX.
        Some apps do a little teaser game to begin, like TweetReality by Oscar Falmer.
        Ref: https://itunes.apple.com/us/app/tweetreality/id1295207318?mt=8
      • Some authors do their own heuristics based on the cloud of feature points (ARPointCloud).
        Example: ARuler by duzexu
        Ref: https://github.com/duzexu/ARuler
      • Plane detection consumes resources, so disable it when you don't need it! (see the sketch below)
      • New in ARKit 1.5: vertical planes
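
A minimal sketch of how detection could be switched off once the planes you need have been found, not from the talk; the function name is illustrative:

    import ARKit

    // Illustrative helper: re-run the session without plane detection to save
    // resources. Calling run(_:) without reset options keeps existing anchors.
    func stopPlaneDetection(on sceneView: ARSCNView) {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = []   // empty option set: no more plane detection
        sceneView.session.run(configuration)
    }
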
  13. Limitations
      • Floor recognition is slow
      • Lack of precision when you are far away
      • Furniture
      • Walls are not found automatically (solved in ARKit 1.5)
  14. Technical aspects
      • Difficult to test
      • Not easy to think in 3D
      • A lot of math
      • User interface in 3D