Overview
TrueDepth Camera
iPhone X and ARKit enable a revolutionary capability for robust face tracking in augmented reality apps. Using the TrueDepth camera, your app can detect the position, topology, and expression of the user’s face, all with high accuracy and in real time, making it easy to apply live selfie effects or use facial expressions to drive a 3D character.
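A minimal sketch of how this looks in code, assuming a view controller that owns an ARSession: the session runs an ARFaceTrackingConfiguration, and each update delivers an ARFaceAnchor whose blend shapes map named facial expressions to coefficients between 0 and 1 that can drive a 3D character. The `jawOpen` key is just one of the many blend-shape locations ARKit provides.

```swift
import ARKit

class FaceTrackingController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        // Face tracking requires the TrueDepth camera (iPhone X).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Blend shapes map expressions to coefficients in 0...1.
            if let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue {
                // Use the coefficient to animate a character's jaw,
                // or feed it into a live selfie effect.
                print("jawOpen: \(jawOpen)")
            }
        }
    }
}
```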
Visual Inertial Odometry
ARKit uses Visual Inertial Odometry (VIO) to accurately track the world around the device. VIO fuses camera sensor data with Core Motion data. These two inputs allow the device to sense how it moves within a room with a high degree of accuracy, and without any additional calibration.
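VIO is exposed through world tracking, which requires no setup beyond running a configuration. A rough sketch: once an ARWorldTrackingConfiguration is running, every ARFrame carries the device's pose as a 4x4 transform in world space.

```swift
import ARKit

// Start world tracking; VIO begins fusing camera and motion data
// immediately, with no calibration step.
let session = ARSession()
session.run(ARWorldTrackingConfiguration())

// Later, e.g. once per rendered frame:
if let frame = session.currentFrame {
    // The camera transform encodes the device's position and
    // orientation relative to where the session started.
    let position = frame.camera.transform.columns.3
    print("Device position: \(position.x), \(position.y), \(position.z)")
}
```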
Scene Understanding and Lighting Estimation
With ARKit, iPhone and iPad can analyze the scene presented by the camera view and find horizontal planes in the room. ARKit can detect horizontal planes like tables and floors, and can also track and place objects on smaller feature points. ARKit also uses the camera sensor to estimate the total amount of light available in a scene, so you can apply the correct amount of lighting to virtual objects.
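Both capabilities are opt-in flags on the world-tracking configuration. A sketch, assuming a delegate object wired to the session: plane detection surfaces ARPlaneAnchor objects as planes are found, and each frame's light estimate reports ambient intensity (in lumens, where roughly 1000 corresponds to a well-lit scene).

```swift
import ARKit

class SceneUnderstandingDelegate: NSObject, ARSessionDelegate {
    func start(_ session: ARSession) {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal   // tables, floors
        configuration.isLightEstimationEnabled = true
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            // A detected horizontal surface you can place content on.
            print("Plane found, extent: \(plane.extent)")
        }
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if let estimate = frame.lightEstimate {
            // Match virtual lighting to the real scene's brightness.
            print("Ambient intensity: \(estimate.ambientIntensity) lm")
        }
    }
}
```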
High Performance Hardware and Rendering Optimizations
ARKit runs on the Apple A9, A10, and A11 processors. These processors deliver breakthrough performance that enables fast scene understanding and lets you build detailed and compelling virtual content on top of real-world scenes. You can take advantage of the optimizations for ARKit in Metal, SceneKit, and third-party tools like Unity and Unreal Engine.
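With SceneKit, those optimizations come largely for free through ARSCNView, which ties an ARSession to a Metal-backed SceneKit renderer and composites virtual content over the camera feed. A minimal sketch placing a cube half a meter in front of the starting camera position:

```swift
import ARKit
import SceneKit

// ARSCNView pairs an ARSession with SceneKit's Metal renderer.
let sceneView = ARSCNView(frame: .zero)
// Apply ARKit's light estimate to the scene automatically.
sceneView.automaticallyUpdatesLighting = true

// A 10 cm virtual cube, 0.5 m in front of the initial camera pose.
let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                    length: 0.1, chamferRadius: 0))
cube.position = SCNVector3(0, 0, -0.5)
sceneView.scene.rootNode.addChildNode(cube)

sceneView.session.run(ARWorldTrackingConfiguration())
```

Unity and Unreal Engine expose the same session and tracking data through their own ARKit plugins, so the equivalent setup there is done in the engine's editor rather than in Swift.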