in the real world ▸ Places virtual content in the real world ▸ Relies on creating a virtual space anchored at a fixed position ▸ Combines motion sensor data with video capture
2017. ‣ High-level, session-based API. ‣ It requires an A9 or newer processor to perform all the complex computations in real time. Thus, you’ll need an iPhone SE, iPhone 6s, iPhone 6s Plus, a 5th-generation iPad, or a newer device running iOS 11.
// Set ourselves as the session delegate
mySession.delegate = self

// Create a world tracking configuration
let configuration = ARWorldTrackingConfiguration()

// Run the session
mySession.run(configuration)
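Since the snippet above assigns self as the session delegate, here is a minimal sketch of the conforming type, assuming a UIViewController subclass (ViewController is a hypothetical name) owns mySession; both callbacks shown are optional ARSessionDelegate methods.

import ARKit
import UIKit

class ViewController: UIViewController, ARSessionDelegate {
    let mySession = ARSession()

    // Called once per frame with the latest camera transform and captured image
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Inspect frame.camera.transform, frame.anchors, frame.capturedImage, ...
    }

    // Called when ARKit adds new anchors (e.g. detected planes) to the session
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        // Attach virtual content to the new anchors here
    }
}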
from the session object

// Retrieve the current world map from the session
session.getCurrentWorldMap { worldMap, error in
    guard let worldMap = worldMap else {
        showAlert(error)
        return
    }
    // Persist or share the retrieved world map here
}

// Load a previously saved world map and run the configuration with it
let configuration = ARWorldTrackingConfiguration()
configuration.initialWorldMap = worldMap   // worldMap: the map loaded from storage
session.run(configuration)
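Because ARWorldMap conforms to NSSecureCoding, the map obtained above can be persisted with NSKeyedArchiver and restored later. A minimal sketch of that persistence step, assuming the app picks its own file location (mapURL is a hypothetical parameter):

import ARKit

// Serialize a world map to disk so it can be restored in a later session
func save(worldMap: ARWorldMap, to mapURL: URL) throws {
    let data = try NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                requiringSecureCoding: true)
    try data.write(to: mapURL, options: .atomic)
}

// Read a previously saved world map back from disk
func loadWorldMap(from mapURL: URL) throws -> ARWorldMap? {
    let data = try Data(contentsOf: mapURL)
    return try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data)
}

The loaded map is what gets assigned to configuration.initialWorldMap before calling session.run(configuration).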
known static 2D images ‣ Changes in position and orientation ‣ Integration with world tracking ‣ Validation support via the Xcode asset catalog. Tracking ‣ Multiple image tracking ‣ Images don’t need to be static ‣ Position and orientation for every frame ‣ ARImageTrackingConfiguration
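A minimal sketch of configuring image tracking, assuming the reference images live in an asset catalog resource group named "AR Resources" (a placeholder name) and that the caller passes in the app’s ARSession:

import ARKit

func runImageTracking(on session: ARSession) {
    // Load the validated reference images from the asset catalog
    guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                                 bundle: nil) else {
        fatalError("Missing AR resource group in the asset catalog")
    }

    let configuration = ARImageTrackingConfiguration()
    configuration.trackingImages = referenceImages
    configuration.maximumNumberOfTrackedImages = 2   // track more than one image at once
    session.run(configuration)
}

Detected images arrive as ARImageAnchor instances through the session delegate, with an updated transform for every frame, which is what allows the tracked images to move.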
scanned first using ARObjectScanningConfiguration ‣ Detection of a known static 3D object via configuration.detectionObjects ‣ Feature points: well-textured, rigid, non-reflective ‣ Example: a museum app
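A minimal sketch of enabling detection of a previously scanned object, assuming the resulting ARReferenceObject was added to an asset catalog resource group named "Gallery Objects" (a hypothetical name for the museum app example):

import ARKit

func runObjectDetection(on session: ARSession) {
    // Load the scanned reference objects from the asset catalog
    guard let referenceObjects = ARReferenceObject.referenceObjects(inGroupNamed: "Gallery Objects",
                                                                    bundle: nil) else {
        fatalError("Missing reference objects in the asset catalog")
    }

    // Object detection runs on top of world tracking
    let configuration = ARWorldTrackingConfiguration()
    configuration.detectionObjects = referenceObjects
    session.run(configuration)
}

When a scanned object is recognized, ARKit adds an ARObjectAnchor to the session, which can be handled in the delegate’s session(_:didAdd:) callback.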