This talk is a shorter version of my previous talk at SwiftConf 2017.
It was given at SwiftParis.
It focuses on the SceneKit / ARKit integration (not SpriteKit).
- Image to render the background of the scene
- Tracking information to find the device's location, orientation and state
- Scene information such as the light estimate and location in space
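A minimal sketch of reading those three kinds of per-frame data from the session delegate (the `ARSessionDelegate` conformance and the `sceneView` whose session delegate was set are assumed to exist in the surrounding view controller):

```swift
import ARKit

// Assumes this object was set as sceneView.session.delegate
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // Camera image used to render the scene background (CVPixelBuffer)
    let pixelBuffer = frame.capturedImage

    // Tracking information: device pose and tracking state
    let transform = frame.camera.transform
    let state = frame.camera.trackingState

    // Scene information: estimated ambient light intensity
    if let estimate = frame.lightEstimate {
        print("ambient intensity: \(estimate.ambientIntensity)")
    }
    _ = (pixelBuffer, transform, state)
}
```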
- Position content using ARAnchor and/or the rootNode (initial camera position)
- Interact with content using a UIGestureRecognizer or the device FOV/position
- Convert points back and forth
- ARKit does the rest
INITIAL CAMERA POSITION
@rompelstilchen

```swift
// the root node is updated relative to the camera by ARKit
let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
box.firstMaterial?.diffuse.contents = UIColor.red
let node = SCNNode(geometry: box)
node.position = SCNVector3(0, 0, -1)
sceneView.scene.rootNode.addChildNode(node)
```
REAL WORLD (PLANE)

```swift
sceneView.session.delegate = self
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal
sceneView.session.run(configuration)

func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for case _ as ARPlaneAnchor in anchors {
        print("plane detected")
    }
}

@objc func tap(gesture: UITapGestureRecognizer) {
    let location = sceneView.center
    let results = sceneView.hitTest(location, types: .existingPlane)
    if let anchor = results.first?.anchor {
        let node = box()
        // !!!: this transform is the center of the plane
        node.position = SCNVector3Make(anchor.transform.columns.3.x,
                                       anchor.transform.columns.3.y,
                                       anchor.transform.columns.3.z)
        sceneView.scene.rootNode.addChildNode(node)
    }
}
```
```swift
let location = gesture.location(in: sceneView)
let results = sceneView.hitTest(location, options: nil)
if let result = results.first?.node { ... }
```
```swift
// find the current nodes in the field of view
if let pov = sceneView.pointOfView {
    let nodes = sceneView.nodesInsideFrustum(of: pov)
    for node in nodes { ... }
}
```
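The "convert points back and forth" step from the overview has no sample in the deck; a sketch using SceneKit's built-in conversion, the `projectPoint` / `unprojectPoint` pair on `SCNSceneRenderer` (an `ARSCNView` named `sceneView` is assumed to be in scope):

```swift
import SceneKit

// 3D world coordinates -> 2D screen coordinates
let worldPosition = SCNVector3(0, 0, -1)
let screenPosition = sceneView.projectPoint(worldPosition)

// 2D screen coordinates -> 3D world coordinates.
// The z component of the input is the normalized depth
// (0 = near plane, 1 = far plane); reusing the projected
// depth round-trips back to the original world position.
let backToWorld = sceneView.unprojectPoint(
    SCNVector3(screenPosition.x, screenPosition.y, screenPosition.z))
```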