A configuration that tracks the movement and expressions of the user’s face using the TrueDepth camera.
SDK
- iOS 11.0+
Framework
- ARKit
Overview
A face tracking configuration detects the user’s face in view of the device’s front-facing camera. When running this configuration, an AR session detects the user’s face (if visible in the front-facing camera image) and adds to its list of anchors an ARFaceAnchor object representing the face. Each face anchor provides information about the face’s position and orientation, its topology, and features that describe facial expressions.
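As a minimal sketch of consuming those anchors, a session delegate can filter for ARFaceAnchor objects as they are added (the `FaceTracker` class name here is illustrative, not part of the framework):

```swift
import ARKit

// Illustrative sketch: receiving face anchors through ARSessionDelegate.
class FaceTracker: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let faceAnchor = anchor as? ARFaceAnchor else { continue }
            // Position and orientation of the face in world space.
            let transform = faceAnchor.transform
            // Topology of the detected face: vertices, texture coordinates, triangles.
            let vertexCount = faceAnchor.geometry.vertices.count
            // Expression features, e.g. how far the jaw is open (0...1).
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            print(transform, vertexCount, jawOpen)
        }
    }
}
```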
Important
Face tracking is available only on iOS devices with a front-facing TrueDepth camera (see iOS Device Compatibility Reference). Use the ARFaceTrackingConfiguration.isSupported property to determine whether face tracking is available on the current device before offering the user any features that require face tracking.
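A short sketch of that availability check, performed before exposing any face tracking feature:

```swift
import ARKit

// Gate face tracking features on device support before offering them.
if ARFaceTrackingConfiguration.isSupported {
    let configuration = ARFaceTrackingConfiguration()
    ARSession().run(configuration)
} else {
    // Hide or disable face tracking features on devices
    // without a front-facing TrueDepth camera.
}
```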
The ARFaceTrackingConfiguration class provides no methods or properties of its own, but supports all properties inherited from its superclass, ARConfiguration. Additionally, when you enable the isLightEstimationEnabled setting, a face tracking configuration uses the detected face as a light probe and provides an estimate of directional or environmental lighting (an ARDirectionalLightEstimate object).
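A sketch of enabling light estimation and reading the resulting per-frame estimate; downcasting the frame’s light estimate recovers the directional information derived from the face:

```swift
import ARKit

// Enable light estimation on the face tracking configuration.
let configuration = ARFaceTrackingConfiguration()
configuration.isLightEstimationEnabled = true

let session = ARSession()
session.run(configuration)

// Later, per frame, the estimate arrives as an ARDirectionalLightEstimate.
if let estimate = session.currentFrame?.lightEstimate as? ARDirectionalLightEstimate {
    // Direction and intensity of the primary light source,
    // inferred by using the detected face as a light probe.
    print(estimate.primaryLightDirection, estimate.primaryLightIntensity)
}
```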